[Binary archive data: tar file containing var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log); contents are not recoverable as text.]
BH!IˢRՌc8ez R|eDp;Jp TiXLqr]Xle iՅ҅+~%z⁜aoohʔJC33βFJ%Q3uH4ª MlmX'*$vICP!SLT2`&(x$P@;Wa/+*^Ƭ(-е)8Iid)Gji@Gs4 J`lˁ@EZEZE)i{Oɷ*zՋ?N讑Ky0dEt3XAC:̌_y25=Y91a8 }vv>OWӎyȧ_w /|~{֞\۟y%./9?'|Xc .Z(YڸYkWY$}w"pq0qnND_աתD|mBgCW3)ZT|w |mC[ꆙ{^{?N/Ԛg*<&wx|?)oKz8vmax?j6-D:ޣ^f]qWᤊgm~J&l4V^uUdP"5"xm~#LېwcO/c2&|dIMԅd.Jhi .CtR+e< ,m)Q)FXBӫNR7.֟ϟ`^0֨xa9Mp\vZО*Vߍ>|z]G?}6[)l/xKQv6i=cg-"f~YypXKtlD׬riFL#^?$wt$5~r q ɤnl0q<.D*U#b\J8.p $I[UI$% Ͳ^JP>;X_1Z{XK0hczns: ޡM1@'Lz0_oX d}:O#;HMyӐ7vOZS[S39\*^ygT 3́7rZS&nXQ9l FR$7\B%*&I{T(YqWv$0άiT1::bd\0"49F_d-,Γ:Sv~<氖S\K.6x@ fu"0sFF?AEilRX۽ؚ~oígxmAי.F ]󭋼Xkg(+Q"1h ,O4m('[dFYs{,nĶ=:<o"t>HqZ&]Q KN=8bJ3e#Ы05u18qAxgz`\h=z4EK(QJ:uqf'ʆ<;;4O32  \g2  *D hG% `]𥵎пBk u.W10i-yP`t҆ 'Bk4*ۋxl.k4:Ѱ߯`]A˟GHVi`; ?{Ƒ"_6n:d{Xga2cEڎ}Q>$DRj%ΰW] 6Hp2 j$P6Uߋ CZQ$HWLp@2r8 c.8B%{b^\Z9O;ԭ"ȎᕍE.֍@7 ῗv\@A`ozc[-\ɛ+YeG4Mzm!S"%}g{`ޅ妢 sw/̸onn̯{O/y~o#O0w͙,_R'`(S9;9'n}_F.saԽx]kHf\"ξ6D/_X!YTrb2Gǿmj@bjGZ+ޕ/o/T+9]Hu^ɻ-h/9sy?}7孰]("QqҦ~ 2I'IgW?ؕY71 G".|4|3z|YfrOlS\Us5BKKXHy$"?~^'Mnر21kU$N/B_~_7߿?y ܼ}o=:!LR(3Hbg=0^2uuaau[ {׏~??ǡU2>q/r\y{qlZRMWe7s!?u7b]<70ţr31湦)x߮?W3qp[bC^dnso *xZ똭fsYvb 9}~m&߮ }sd22cD9Y:ٍ2(G۩*ڙ0 ڲ 2+N FŴBAS(-%Nc-aӪC!9+Ƒod A8'IEΗ#-K\1*ѕ̒u Bu:d3:k^d]]ŀ5j?Vq֌gQrF߮64RNrGiΘ'O LZLU-޾o G볳oL8wN@.28'~ Zr{L1`FE+=h<_tU2NYXo*)U\U 6("eYw6vVT\'\i4yv+cߝ$ h9$u&͟.Gm3kAf ]L<8I5v*Ho?n^V05]@naT0\\ť ູV'e?`%!t'j%Py kAѧa4]_gi SFbhd҅%X]c ,0μPV6} c-tl3F ?GݿWm-}}TGV{uA;rFׯxxgT,Tu,B,[h/ZpI&x$Nz5`US"KzRnڐ|1Is6/$w;|qOQ/XkGF@0JBDR8mXS1L, (!knIv(gܓuZ<,ƅGw&RH&ȨLBNZ(a7=fapVAZ>N͛o=0vWXJ%#9*`}LJ*`Ez`rJ{HCD3Ϻm,wV% ȴKIVg>&p9A GJskpgl[qhW]Y\;ZzáVoH;8e_Q==y>5ʹŗ5xOe1So2`2qd#=@UfU@]H)Hڀ;M   `1(/g츍`L0P% @5υe)NYYJhgl3 OqHKjz1.~'qu`a쇳wGW1sS~a=5"hu4kDEq,kZr0΄kϰr-^I0=}F?s7,I,m?bftD 0<~O(vioSf M 镃~=o3oR2'*/yA;C&l儑ڑ)ȠwvMȪl"w펴-"1w1uRBĀ|wXɌ-F-!xd^6` w2YN$r\a$si΁f#=b(ms߫ӝWMeһ-WD&f6^zK,ȽNk1uLa`Pbr-;\]D.q0DnB0-gsu7r9\ݍZ1WwR2WWsuW(Q+"Xry4檐+ԱBH?gzZE9u#}KSUTwf=^]X NfV6 YluL~|vC`Mq,K;tYB΁u8O+6N5%|$8u#ggRPBB襍(.Y=)3,@ I3 ~I`ᗌڏϮ &qTL5rDjnNPu+C-ZK7%PB8 XDhBbx*T{!(\[. o[. o{wjjBv-jڅv]xkڅv]xkY۳%D[Vv.d[. 
o aUH9^oN@v< d+= SzεB%XT}i<1&?WXRR2sÌ7`SAx2d[@L3 "Gp;+J{:9N\#[/|CY]2/[ h<Q+׷pV!~OY0/xٳ%}гGvLJήmMZ%rRk]]F}o'D < οL^?\d#|D=ո'-̮=}jyW)xF`-@9bJTZ>Y&b*EAhS(ǴA@J^$!x˘\s";I7tj3r6CbC[Jԡم~hhM.24WT%sUC]f\<<ӮT@8aYJ-gadB:asm IEp4a|sHĽ.Ihb)OҾד}'mMȜtGA6j:HXhYT)%"Jb "qÙpj<9I'p[>ftܺpHF"2|j|۰GXA5D $J CH[IQFc0G(dWЙC?G'M3DRC.o&3 )dPMlB\(9m~nfI @wIrvbL05sSWˀQXlh Sު^ei^ `f8hM X2 }7pS-Bw D[@Y^Uy3+\oWP$ uHg] {RʞmWsN>*{N>& tR) P1BL `Y20q0 q9 0$5XpbL <* ?s~]`FʩR ˽)1r8@HuVV 1b6:qS"w=|^Y\RY2CAmA;@)UDx #8( wiOd[ KVJFHqJ@H1\Y.S2&x#32x5sf"ܒ1rQAda6U" OJ.ߤH-+k677ݛ?P~kƟ!L ifj\/.XIi·KӸiE`cUMgƠPqx6nsb9X`iUwCO6^/fA)|s|FI\^dA;ݝY[s[9^ۭ[Z0+Jg$:߬xngr?8}|yU|05pmjlr9øp;A.(bHP T4*&x\kzSeU0ZS7^VuЊR^̠dI$eCg4#Η'" 88IfV"kLFɆP/;@׷@yoGT]aSzѝyou$udXI] N`4f2jp*-YB ЩvQJlB Oc5(K)µ%o8z<݉/r F99:^ (5U1kAGiKejmLN@ 8帒ssI`Zj׿iR Ff<kF")9IE$s2JaQiƝVc fUAP8@2q|{(?{aX2VO=IDR#@Pj!iCRҩJEA#)P7@BB!j*մ # (-a ; 28TB{j%p0ްHyi7gw$oĢ>jh ,Β0\ٛ^O\-D`%^\q5ؖ1H8YާAkW/~ev=E?.;˃nyՊ1_Mΰc 9SDήمSwf&\f B,u=6_  ʊN[lBJsTarh\ 6޵~ŻDEa0NXCワL5[O6 oŋG5fBhBSޙjL9G 3<Qte9zw ZamLdxr3¯+ ܌/Zgk#]̕7doGǛq۪ե7/B31:gb|LW]ӐifX#02 v0bAl9{WvJYg5k׳**6p!x$Ḿ|1tWGj89}#MpS4z/o}ߥ|y޼D{W޼`b8Z^?_շۚ:=7Ro^ YX{.#n)K#|g_\M jҫs[flq7557*Ԅ|)l3cu^ʷ"D tKq9‡#‡V1[S0ƊăM0<w]c_Ff gexP#(wR:#\0^V1>o F9v9t:/|bQ¼@[J]b$TŤuOIºeUl#%U Sc83\e3jđAT8f̤&He9ruT_hbS]UӢS[9rx xgCx5[&ފ`[^L~Pse[]8hP.2kmCAbI'Ȫ:] 1ۥNbNu>L<`DҨ(L2{odϠ2YhczϊP0#z YPQz\TknbZV yG(Sy֯nVoY)GU ]C*ǂ({DGg.@s4OIt HjJd|S:dAFAF;'oK^KHom e=x:"{hY`*`/p{f?}+mä|2 1K#hZYU-6U7H׫ 6dt'D] .wM1h_ɄsZpeM=hkA.ʶ !-{CZͰf(k<}4mW+nq]_"`+np05sHז[]GSkFbePu}V.a߳T cztXt@A=gz~М/$ӱW" X fZ;EkE8 -8yPAO"0QDϯo.JqTQܦ3'Ew9lVJHH!Y4짫ظJSoWC+K M̸J{:>HcU|tH x-9:e7n8m8Qeߪ?NQW*`ڻCWINҫiD؊|c>=j ѽ/!HQxr yoZ&x*5]@VfE@|֪ޥk/&틯H>h1|ςKeL=vNL6y-_49\\RkeT>׋G_xϭ`6EͮVw|g|34&^}5)%~3)0S%\մy9 4͗z=~$)/Kɕnk_w <-~׍zL^eka1vFCTQJu2j-Ϣc`%bgJ4gBH*؍S 80x hhۼ ]{}U@:"-LÂ,z%q% Y}}JO=][ z[ ¬K̺SJ"ibRN ,ꤑ͂" xT!t(o> \c .r{0)7DpTP !突e*򨼱0ްHyibwX$oZ/9md4жB$y \H0jZ!QIR}>0k4(1Csm\E4o?zN}gy~-"Z1&IV` $"a5:`akm#G/L|?̇38`70"ik"K$;_$%Y^qHl)v5b=ho|hJ}jzm(@J`=u@H.@ϓk$sr.H./38O}:\:]|[W+*8kwۥvJY/ƽ{jr6sthkBJ՜#gLg7#Y>K}᧻􃏋b4׏ܸw׽-k(BVv7?^|q{ N; nnÚoclm bAp*|=zxMw]YC6ݵY qIG9,d$X"m0<Wd}O;I='5fͤ y⟓ɺto_?]?~Oˏx闟p8Bri"W xfa֦NZ^Zv S}.#)KݼgAN}G5:0qY^f,~m7b*uk܈-nxɧ65%[&pjbC*?l;G&12)u2XOqF\1l:s!~6NF%h]:)޶QUFkgEjf/ esQSNr=eQwhLV;EɉG\ nFFIBkp2SDG ./h%'B9:bfwshz۷?X(eo!RS"g @N&x=rNNS-W1@Cn( C`z3H:2T3$:(]9xSTsLQm﫜q1D=:km=jk";Eo߉c4upCpÅ&iel .y߶͑DM{ ]w%hnLBۦ5TB !@TmDh_Q[y_}^; 4㖮o1ἎPC8T؍m93lnO=s5.D%mk(m>[A= e<ٜ+]?]MF?0|zlׯf=!pd5]Of~yi-f^rlR|cyGݲ]2_nxZOSi6gφnhcZ!qA.gZvcc?/-tϖ -`RqT)R/jV9[ PT:kѵg]!~gnq<~>A W?D]agh78ʚn8oEu89Fi+U˼!>9eD000! ȉʶD[FTP$ 1%ͿP£h%7FPeZ["gƫa]!oU)-';]a}ׄ#wszVCj qfN .EEb 1N0 '!BHQBI˂l*2:'F:LH2T=r!,LJUQ*KJkbl֌QQta1UV.\:0P-=磇L+<_{w:ƶ7%ɪHJ!*IXBp"TʙQP턣F3㉲.C\`{8j+y%f6161Nª1Ц.Ff`\.i[[BU%o$h"h \Q 1$wvꉳ]R2+PB* NmRr4Xʂ#jĄNS̯MoZIk%}JYK`4YYTҺ6XHJ1kΨ2 9vʙ\{6픑ZةST ڶS~3P-EMJBf*2*e 7!o4R*rk>WWWyX)0]ٓ=aQ סG3 m)>~w|`ʽU߽.}ھO>t*[}֍7_?󅝢,VnZ9I[M[YEsl~#apE*Olh\ έz+rYV2V0[<`X֭6ő ţc'15Wm'ozіZ6wfgʼn'n[Bez:^Vj)ixv*:bE&Q^as]$XpEGHZ0#=0>7[]ޞtkfL4Rxrذ@[U"q> t1ɴb{a*)^;ef(NZ+Z4$XlGu'B7 ]srdS 40 DÜRs2:a:p#XTm"}3BBk>nO%Ko옥X"K}:N# ¦U'"b)TmfP e=O(G Y C5I ymX}pIJ:JDjkmsXO9鳧t>0Vhi΀(n<ՐpLs(I"IpF?34g x JDCL2(Q F0hAr]DΣxKeL3$" 4+ 3EiOԡ'˛}gr& )5Yy᫸4usĆ.kyK%g\_>?QmS,IYTO¥b=# P+L1\0& }ے&T% PԂAlăc20x J=C3%ʙÍ<%(1R#rxMXBLDL`IAL21jT.,/,gRjvJbc}v 6" DGY5h%^5N2D.Bz;rkA66wP3 tkV1^k'v\xOO$Z`K&!F1Ң>qC,kC6(8NOtx~|WAxm:jjԘEf5O*ia4T [O-7͌drn-hWEv>SXf'ۈGDI|ɨ\D~VfU@~׌I[c1&pZVInFcNW1U:lao z7+^s!^RCaiqMtqicJ $r, 6Yll>M=gfV&x3WJ4DCEW rC#me=Uvڡܑe‚o)K[@qap2 x#VCw<۾*G:?qzRP]$L˔!Yʈgz8_! 
ou `žl`UD"-9~Q(i$% {jzJ`8J_BN e)mVKyG/ t[Pc]-0U[q/[q?qn%If;4'UICޣ+i<Ճ!{{dH`7檊 ֵ#sUEJDo^ƹ2WrqoUW}1WUZ%v\U) 4W e+XK7檊usUtЛhPYe20W/e? _~׃\4K+Yè9 Im5XI)z*/8P1|2 y@c(r(2h}P#:UtE.07,vw_Fl 3#` 5Wdc^oց߽m_C^৞?K1ȓ\IJHa S'}<>=odC*Y0a(a.Zɼˁ'S7^jaP;Ñ@Yv 1R2(ܹ Jˋܑ{ilBdцNR T>JG E dwQE2-E`Gm,]u& d 3#Aiuw U $؏9w/ӰRv%7{]TqKtuKUʕ{]y9A,vc9g~孥^/L0y٦)ioCo)ȐVܓmv&ߓm+6dydɋHQHjSR#uZb}H}PR0Oeq_l"ᶋ]*-Ǭ\|,%ҮGFB! 3+xcUU.I0D+`' S|,cv݄Pw#NHJ>ٹӵ֘mZˏyYr"2$@VHE*{/kO,MGqXlnxަQ*U唃uF9r>!dB@EKs|OB>g':% 1x IuB9Cs~ %9~ |O]_ÖrE&{QڒEJ"ʌ.HS gr,dI;IlTxP2CW:`ӄ 54!) 0FB%7/b,b0ٓ"d]u"\ (Mdİ>U֍L\j"7AsRzEo,yuF48VdP)v[/؏皚6Q$IWl@*ZPBs_|b=!)`HlQ&z"uH#z^ , 6 ?;ɃqXq}Pd?OÓ4 #~'oՈQƟ\y&K2c` 3k f2߫!ř&THR#oG6$ל<K@ Nq$gȐg2$/)Ӄ|<{U #9̝Dnl]\k_P &Wpt f{/+4z0c͉?Mޮ[1\ sxַ]JOӆ-;$ -wB[;Fƺ۬w)ƍXgˢ'kNqzWumwm{V+IMx 7V4|6>O_eP#p,#ZQMc_ƆApo{~o?wAoY/ib8hGce7-nӵR>ߏI/էf].G\Su:˿(e;uA'xE9FDW1iKG|uJ0inT2fF8MD=wtda weθBi=Fؒ |J"Ѵ$4W"f@;:YکWR!KQ{~uYWW1`o|+TVrS64Li颠!؊X2DXֳ@ FΐUq-!{viQ0X)jEMb(>Mv%4ebvd %}Ku2掠䤄dw"C6(#G&"bur*!uGٜ CtB_VtΗcѓ3{pI+id[w:)sC'hin,诿ߎ $'!jqRq t]1?ӌr9 ~:aW?{7}?Z=U[!ړNyWZ ehkenN=ҪwiN'y/dvhNʬ %r(}G.EUX, #w,XJ"l{UkW7Xg򒔳9JWFk5*$ .aʤӥ"EA+5kunf+y]^? -b]wn[v=ԫ/9ϦuM?~ OF\fZBy@m3\'غx}jOF̀DyjBE^Nq!Q%fৃ:(h&k*2 < t6iOk&ؗ/Q` L@7l&񜎭ruaw͒F3v(,aKÇ4n 4:q w|zf_㫴_˴HxaYO׈}VYߕ%1o,gx =OTiMw߉Jk5:ZpI>E9 )ɐQ%mw'~ 6]88&G|x1cψL]Lշ$Hϧ񂱹3ИLP̝*tVN,#Z&Ki(jwlNgɴNEo+!C(6 [cPQKOJU9vsߊ=>moW_(k>@\tY6~οJvG.L LAʂayʠlHuBź{rG]H%̆C0Ƃ;Z[3J蓋 I)o FK`mP%I@ JޅU i^;%xNZhglkufȹ$yyQ^8qgIvԾ0qa3ثxuKxU8pݯ/Flea>67eg'.#i;i"끕!tFV1xK]b [ޑL.r%%H41Ybb򵯅<}ðYYoKޟHϷ"SR%~+9OED[\BEEN cJhr Re@[q ))y ґ&œ$ (:VĹ[k:Gu_2WXextت+ᨎ.Zju.XeU?'bתۻk\ԫ>Kn|Wt- 1b^K //>}nd7Ti WKݕdAT-Xk:xE"hCJIVj%a(}%٥i"-.%瀽hcpV+( -oh+ZRňlz~vE 2 bm&yb(dZUvsZAz.۞Nڎ7ډHOe*!Ֆ Y5iN~Pl:,u@!Yz d%}6~cw̴#LX«'ʞڏ49*r,S@ Y3Sr, | IOR e[NӬ%DKF[*/,|?/wS5Ad1fZqWo7e—4*6+lOw4=7oz|6m +׻t~nc!׊6[/Wе;9 1''ijFW7_f7O91ؓaXߛMt%+|nw7>gڠ+׏t.y1OeN|k<"ˡxťoYژY=J.3o%O:{HI՞]g0RViuF*)_"#^zT)h}4Qy XLreBfknv9`G ؉bgv>PqW-adi&lJSTZ>IӴ=n9&V f) نJ-$ zm$ Į:a;'^!/B8Uad4l说Ufn~1XSw74 Iq& E) 1q+rO@OK’Ovݻ_zv7$Y䧳RWw^"o-hwtF;Q^m܀۽EfH n"*+/2ԂG5A+cAk\>Cޑ\W g1T(8:EBU!h1R K.3!.gIh :EN8;YmW0upIQ헰]L@pď ɓ5,*⭤%X*2)Ixe ɤ3lk%M;&lD ,!(mr0@nXW*jZ~kOO}3 =0$HP@0'[=C1 ģr2ڴQ8Fqp lZb":@Ve1l. NUXYzQ xe:*j4T F53's0 Qjх^1-m܎0UN](<S}O6>.(QGo>d)PP1#2)t/䭒Z"QA󘑱Y`YXG) :La0oqaVo{VqHU ͎KFD"Mظ)ʂ>i&]Xޡ>UF[JE&Q۬#e@*3l)u3s%,yGJ;JLCSS.we+͕`s*E ôyH57:Eb xC =^$gA?(γж.̉\ŏ\Æ F7P1@#ƀfgM3),C+qT9* Y@gErpmo d@3Oq+$(H9wX#+cC`S @]iWM|{ {\]fʋά޹**$TF FNDÔR^Ѡ3To/5 y U"ڻsHr*RHxb*'Am&s*CWylݥz*;=FIrI.6ϱswQ\ R\,:{sKeb_ ;>zxO#0#\%.ճe%% WվA \\tni=w9#gWYZMFp͵ B,1W(`*K 7WR '߻z\ \ei9wR +' ^ \eqryp[+)9<\GOyV?O^ϯ?gf@+eK X)T/k *ګ\+R$N:Ny4ɤ@" FhiꄒG21&K9d ~G0Y g0u5sOW1wwW,y.>wW?ڱŗtI~[qܥXR|6)>Z $%x)CpRbNh93A%od L.A)YG=8nFr=&t!F? 
W1q=Aa )P5?(adŴX*gHŸNzUWTߧ4'E˗d[+Hwr ѹq.@N%eL*$ <#:j)ǔT'n\H<\J=U_~sz.vz:_nb3q9 qk0_* O?4,V.+2˓ؤ4:71\w츓i}|'9>q'pvq5#\;ogI?>O[rŒO|.S:s5}h3?7e~33?w'h ,YxyK7_s驖FH y UT.Yމ%Gj$λ<98nr`]%:|"ԓvNM/{l8-ötZ7C ׻ZCq+Ƅx}B Z^9l%)a Fo=UI'#;g,1Lj}@oF<5HaTb%J ")uG^C8pA5]gDçw/Fx9q^DgCHAS@hTȔ2G V$DZ{,Z-b {{%Ի"k\!/瀽i0؞b'qmߜȿQZH~4fW9jC\*]Db?\nՑ7r EG"ur)\E"&ygLDB+bu;z]l Lf7ώ s.9ܓNfؽj9x`&w ڌfᘸNfQ)1qg=f8,XOٸSVUI}}Mvn*CS)rqP@Q{p8-8) Q" F' !`# sJf)N#E6Q4e`lZԼn(#LDavRPm:y^u1KBz݆)ԞiN,~țy1W@ c~p/=UFxțW?wn.Vtҏp=>ǟs!Vϸ?^ЊCc4vx?(\gE!S#dyJgd5ByY›_$/1x!WsBB'C›ǎ#xgi8ޛ7Ls4TMJtƓi.0}Qӻq~nk\;cSnu]5sWu; ԛ3?{zȩTfPk;W]]_@ s `IW;]`x,x]+j|J>|!:PN]֤>1Zk ΐ2⸡&h\X$_l 3))\HBhUI$% hQO1&$䙚T.fs9tOenPfњ#QEgxN)fzdZGqR W(b&g_:`xrgPm}uObTJy*+A3($ Ь{N3[lrzَP3[L~\lGẌ́RPp*IY "Pc$0άiT1:*Jd\0)"~"49F_7 'fz2~{|Ɓ :RwRP>hϩ`V' #>rhKQ%"xRpbCEebC|e|B?Yj&k}ָk{In#]%]SFv֚},de% A$W£剦Dy̵V = .{L*Mh7 euETdQXh|zƽԀLAQe]05u1иq#Axgj`\(=j4EKbڜgggH( b&sz{s%%A ;ަHnsnx- 3;))Xo0)8빖Y  1*kUHq&H%֭ПGHfi`; ֱ\䶬( +0djDPZ? M"H+&8!$$04K.dj%y1rL-NTvq,{ E&T5g= Lw_sβk űiÑ="p|cP!t&>0<(u=ez=%XdLf(3j)(rv2<}ltL zܛ6Y)XO] t\@#  uM"co3a"ĎS;W3.ξ~Wsׯoo/UkV(eu@Vp (*цʊ܃mxJ՘#3MFwó?_w7\x;Fs̵Oxo}9GZǻo\=4$i\I+]7-Ú1qydy?y`,tGkvkN޸*#W>dӪMjV[:d0D`w#_=v'|,Oj̚I'Ƿ;{s⻷߾.}w|2soo/޿Οpn!`8kAe{Vl}׭~BN}>8⑟8r?v+?"8;۲ϛy!ǭ߈ҸtinK3^'hͺ#7u5X:n@_|9ɣ?[HU1{T1Xٲ:U Ri43T}c~t0$jgQW:`oux+-2㭦/eݲ|r݆JT kT[HuTBYy0Y #AfզX/GSس󈝖&y2E$  >`SĿQd "}c.yRTOPF :-hc!TrRzR1p迲ĐN).-:}VC7mBBSu*ܨ =A:wpl#iGFŮ#%VgHvCTUܮWXf8&&[SlL/0tTT~R-J5C GEtZdRTD_RTn@FoAF'l)N)EEZ~jf(M^[ڽY?},4!_n.ϫ+#)O3_..?yO ^gu|8ސ全o N|Yohӽ:=?Fc4]J*kvv p=_lu@!-6(/P4 M=Pwګp.K(DA]0 TpPs:{kQ!Qq֢#&u)M" HHk4:]Ҁ``5i3ǣPw3KAz[;x7ln~z'z s~?](5pI1NaY;2uT+A c1Mxo gT͟޺}J9:# $@pVqǫScx&K*?q:ujY]ݯi9ŀ3RBg6Pܭ;JgSН5kִ|IqoOnNr'BnmF] vs}b2ek=ݐw?NٍOugzmٕ܂qx[vGw~g߲_bGw\8m6 czt宷a,UsqS>*u;&\TXX T_g?~hcolj0 KWBAJ LJFM'@r2MB_$g=נ-h4! +ϧi*h~xlѥKvQ&9# ̝i<.!vB~;i~P~% > 6[OLOdҒ@} |[()M r ԀY18`e*Bq(0ѧA8R>PZUqcw#o){enF#zr+qvhN%4=UME3=9b4oljܮMgXjwo^󲭈x%d+r&ћ!]I <]O,qPP焥w9YiҀ/Z{fQI|a38/\qVlܝܳi}k4/? nG7ɨ1Ehl u( 6ze΃W֨ cOJVmjX`!9ERQ l谱n&nni%n}rݦfC0&d}r,BXP9<ô"!E6/: !32QtٱIѓʊMc<‡T *?l&n8 41#6ӏC=jX6h9rPB%ID!CT&٠PlsGo:b!RLH!"Z4#iN{&X7٣!s}WrʹP{wguyh8d,6Vl޶*D}hh l *2J3d|p$ͫo+#~\K o-N]Vh;v꒑z(BEȀճnGc} FbB4ZXJ FMFPmhM'խGNՉ}nyiC~ҍW-/z.Iǯ~ 6aJ)FYY Y'],d%d"i{pPui{~!q|E %Lȳ Cب]LG&1=&c_+|@S@=rT?>L=n6PVg,Q3L=[08.#Ƀ}JIaa]4(I p 繽N}B=s=pQBeSQ ɗ7֣'J!j;M`Qiy*^qT!z8.O9D Zlǵ)Ds5`@(äFPz?\?7:`]B nn /pՀ'=Rp7uʂ$PV@hɕH6 H0HyHX5sO6IWNsjJoSs|s< +E/5E;(}u>m}BBT'RmM}3tբؼ 0UMݬm\许;`ׅ eq! mn:^K>79|ԯ&'RHt>eqꝙtjCA 4(x 2ѣKgPIV:[d䐁 g`\dOdʴZŬu1S,dg NKz[3q>7KJZח’oȥWAobk!OQ~4S^ _8X[3$B̊1*zbCex*JQ1Ic et!h 1޵5.cȦȠu;N"/ 1 BT.(V(RZٶhjuI},*0\V! 
mcT ,Ck+]'ΚMw"EQk;>&`a'œNDoeт&(LԐXe 4sG>"5 mAr2/W.EV3GC4IKvB 92/* 4GuTl6dUBD (Uo38&W[{t⠭L۷{ލp%X9]lq*x^ q*GnNQrv]j- 05SCb_}(RbC&)_3{&G6O%zg2F QCX^q6ӐC[}$I]XmΧtSU/@Ynn,@Zv(HMNTlVnUlx);{wG34HΛ P:0  0XCT<V(|gzg,,jV6yW1R.SI{IP2cLmB,9ed-e1,:d藒t :R:}5gLOX)[j-n=g8Nv+IsB,KJUsqQKtdDGZ´FG_/kN*w$R-ju5C GEtZЁ')vjEF )~K!Q4c3 mJrQki"Keeӗ|‡WQ/jj:/b>4b^?4_Ƿ I*ėuYt;J=Fݱhke]\imCZxHK Pt4;ʚF!?;4D0]2JQPL(SyPUpPs:{kQ!Qq֢#&u)M" -F% X zqfmNzm^ށg ]R m֎*cމFə_'0҉_%L˯V0-NJ#Ŵl^l_= #4\N׏Z4bveJV7u(< ){=_yK|"~eӛ#ҾBDR)Jt&UQ e; QVQUhD|cPJk&(Ƅ3M-P,JD$BpuB暉B=tT"/% [C&]]r߻8YgO-Sd^wn S1%1t6HӁ egda(\3:nCS HFFNDɦJVł>"?{Ƒl /w~T C&:eZ5E*"erVIERGiDM4N]]E&rt}G)@_1ux!ū;YyXiXػ":E-AWDJDM5`Fxsܫ 5[Xw/It9eg˅B ucG*HJP5^F@dƠ6 {u *uӺpVɐq㴞@Sx|jc2u#HgJp՚9W38}p&JH̀RyIUiqN"TJd 4w AD!IxΏ:eAص{^sGyi_kOs|*R1@J*i^&m -{O;F9m;0Δ_uEFS\7%]|(検%хgWz 0)x'c dG [3h^tiŲw2"5h( )gE]g$MW0fv 7g18TSӝ P{R{tF,(5嬣c/1#uHKHgxmaQsjҮg#L%++.ҾA`OpK-ʩY"rDSJΈdMYELMu f:ߋϗ\WڵY{ZbMk*@]J zy*^9VFR8I]O^(4k˯ B`*)DIH"h\~v7_ EՊh<ݗ~9:L0S`LڕWbp-,FGդ-_^6(j6Lw\>WA^qs1O!tJwZ=^v+?;/~vUgNq cخ,=grpG4~Bi X[:Y׌Z8íͬ;vLIQ.=jsx2mx9#V%׵]vjH5>R6l>)WWL!fX}j&gx~ylwxO??Tۿ}x R= A܅ܽ lV %C_8?Zb +6ƾ)g<kD0Yxkb32{RnQm9I I@u0e7XE d+Z'rRKaSnuO:D+v~#@{dm/R#@cY|EdyZRVJ$yxE`\\ &^E\uf_HlE|Սx牷 FyxC.7~N&HAVI@]r΋&IG:cm3<,茵؋3!E2,BdY9:IJ6ldd*>j1 ʅlqщ_{5r 7?s>;^sSh!uu{ kSђcN 61Bp1QH[eSNjR}%61iJ5)-C / )|$O J2jbq؃;vg}~|6ʺEk_}٠+zeN`~-ILWj}Fqp?mNGȲAZYa)IfQY$DԶD7fKd<EL(3huAlE,fHCT Bv(H|a,KRխ4w*?vTO %i׺$=y-$`.*LrbT&ik1 w:tPM9"̵,x$aGblPD4@9iŝNYw;%Z.X;V8eZ+7mk ?6An/Q7H뚥w-v}& Rihu:/DG<. 0.]AH6{DENE9W$ݼi'[n79*VPy*7 .ٔ经t9Eq{|A`2N_<=#Ӱ40Id7]R?W%<[>zyxmf t.rf}|p,Ybg,zkkQ:Ӈ4>iPOp?g?Š `@v8+nPV}Kc{ J\qVWOP#z%j/Ap9 >1Xk͡Um{^WW^xOK'KcI_Ե2i"5lenB̺)Q䰔j%ͭhuK+_gW,FPkUJVUYJ46fQ˅3,H aA 6\A";hJYWOq!K;n?.̲qd k@4`%2 W EE"m O!#Vg#@Afg)PPV<[CoÇ Dzs4QÞj9i?7gtN:Ե%Yw(BQnlvj#׶<MK^s>)*o??"i[3qBVPf$cDT="6j&XE_ӱ,%hαS"""uژ huRBTL1y`iѱ .Yw#\ٙ<c\=.אp8Gp85/3:iQwAlw݌v}^&c{WyNx(.q OuA, zJB"z"( 1)e繴mrD'>[[[S$L"b4ڜNR ƅ"T# l|GGAj 3g,O"H%) (cؤ]ʚ#%@0k ht}"nWbaoD՚Irq|>o>wl~"fbq>@.٘ N-ǜ yXXLо(Ф(l 8Uzy&gOcϠG/N (O(cB:1UJ`V[9z #OWwFndQgꙑ>VLF^:\FȲUzϺ2Je%"^ߙ_NYS5>{Wqdh[EVvø|c{t#W^#B35g_8k,A** Va-EUAtJ`2 VWN϶|RȴP5^]Awe_7E||+7;Y@ M֎* C_/FP.sIP3Eao'vRx;" Yw"e^(Th<@4)oUB[pEF>Um+N^E"t[E"d}$~_G5p%bFTbS1UC; XGP4 l9F^ں0mWo?\g,a·D<1S0YX`I56O΅hZŨCX$l+c/ z5 +a{z]L*xvik H(E@B8L"mY0|>57:Ah*  45.KP( ?Qs[T~xZX}e{]xR;$| t^B쭇z՚O}Fo#V=1䕵:2$7!::M6EOS||&9pvsz-nO=h{f7&ڢ܄YtNF< cmz0gi;.u|:6m|Hٟ$R0[3kMQ-FZ~k(PV4s 2$1> d1;PEt"2.e"/F/m[\}ܥr9̓d,F41|<ͣKG^R9zIiA_ͿFF㏣q@ !/蒿V|Yhoamv'CZ,0/Pš PoN;{2% ,4@7Deje]u}S.c<kVIJ8kQXU:)L" 6BF{Ncn]`V{˶5zU},h!onxw=mnz;zީF 7Ceb*Y5׀ >G& `Ɩt \z6 ESr)*E D(k`ḑ2֝h_21RimirŘD15 mb )A2J_󉋊r ]9nni}4sƻaz.! #Ny?_K^x9;J)J4 41l2QܗB댾'-Rh["J0,Ĉ RZyc2ٔ@**%0'$d :K,>YϙNݣ]߮tUkыe墽ϖ/t?X FrQKPQPdHhTm9d{"^;ʖ'?[ZjCݽѵ# _ Fu0+^J-|VI+J_`-|kݧ'/g]Cˏ9Ee DMN6R .ڨ;)d"Dkfy.g %0cE8B$O )ޅH$2g3;-PT:[Zm 6Ae!gHOh&  l0jmx&Jd#YzY/9:ǖ|͹^ܺca*FsPvXeC24bHd9$< 󁆆WFkhfi/Xm)grBP9 WJPЃ!NJ BM {Aܗ㌡qO(,ar<zKX7bUE'aHˍl 5`X{ O>Z6j/XVdKNt "@``TT9%KS0[AڮjpG>2fҶg<%+i0rΆijQN\ 9<?|U̚drՋ0L.?>?s83su4#w{yZ5c'ՑGZ qt2ddJ(}s}n?Z0gQ"ǐ8"$7} 9XEy4Ki]Or(m,˟aHggŧ4ѫWC Ϛk6\GcJ\>e~k)/! q_+g8;Upv1Nbg2<;9ϭ7s(bV|5 > ;e$Pu$0t6j}01xx,֕|<~ZLf̳gnUG]wWCXrf]{tt }4LY6sy}, \Y5b^2&fmPG,B8'U{Q訳2:R+#bQ>dЎ$'ZD6 mKP>IDB0J$I$1{[dTڤc&Eh9Y]:D}Bb;d$:NXԺ̨",dLڻU[(WթKE[J \."T%*/}8 [&((]r":͝/Qu(5J ,g'ǭrr#CKч0_u19hML6DnmٮFWOrv1B4i) G洋mgB jDߑq[ZYWYmœB'w۬S:.MxK\y4˔t0Q3ģzMmM0 &.iC7ã˼/6.eYôDMO%(V~ЃFkn^ݶ PVj궕J/C5>[)5'}.IFfhgwuIMp(XMۻO;k>+wgX>D@AL 8R&5cEfLN O췾;+ߝ;g fz,fvs5=]`{t|[kSf)#Ն,1Bp1Q.ˬL%3yiJ^J˪E^70IaA:! 
Um۱wF|/r<+s+_=+zGna(x<[/ޣ͇cve.,_MXl DBcPbz sgK򄐼WWLɅ)$qH)eI9u:raඵ:X[:-9Vwss㺉z<5}okUPZ~.gzڕ_1S t%px =@ۇEll oOǒ;rb],Q3H5qf8 ش,ؤk`zPO؅.xYL!`-A=Gn߼}z<}rK|'+f^KwQXVk#ۛQre ZViVv1vC-' &mHIRYPLr?w9{,jkG1'Tr6\\rÄι7 #Po ިztX#R2P&F(ڡpl1!+A^5f)g39Fח/?,#ZiV6lE$# AD de$Am/? /z~aE  u&{-I0E(uFM]jjINZ 郌W_=4[`Fi,ފy$'),6@`!"[pM&1ӻ#acc*_< O<(!+ӓC7  ^l(x6dtY9ުb@lԤĠ|b.֤ K[;vwz/zr.׌E>XB".G\f~^'&1?kfd9'gX5fPNޅА%SɃ1*t rk VrҨIGr?$ϜN$6wBM2a^QJׯ-F*`νHrD0f:x)͙F6 ӿڢ*F`Q26Sw͊;_ZQ R|./&/o9YqnXYfV|>B|=1Alڤ!x7SU8 ]ޞ賃).g'eփ&HBQ|̺8ײ/e(crcAiJt  4#gju ltlC@3>9(za)DӺHj"O/YDKH!YmXgKJ%Q;PRQzrd2i ('W HHJTZ#c3rv#v-Egcw5jCv'&q:l^F믣'GloeT.Et ElX U2E@a~*tXö*P?J T¦2 lasO8_'\D1b7#g7bqZ3.qǹ5ح&k/ +cR,%6dϱJ&ҙQXM+#z]c*EK18--d %U5)X4Yqd/(Yb:[jЌx83e +0MF@+81"BGĎ83;`b G6ނSx핇Y/`p2hkUg٘kwX4.*鋱RLH!:d5*6)#[ҀVe '#b3rm1esu6Cqqqۤ]ر PˀY'@dk`V^6YS0gJQ;.qǹ!'^1+k۴-*Go6{CwBPzޑ - $ 0CCRx[1F TQKoG|{ 7ršrv9(gXLX-R!&rQדBqd/LΓfg5Bw>Uogx3]^w{xޯgn}?:x Ɵ~)֥bDuʆ1)y1D@D- x{"$Pyw`xKwQD  2D ʢƤ>be"EmB|m!sBF~scJLXc|}Y{|v,n6_y!Z.Xm4!JJG)&,oبryEּ;J6s};:lzZ #9nW28_[vQĬ6iZQBlx'ԏ86,)g]{oG*0`{ v -OD$%Y_ _IjD6%RRdΰUuu`^ŬӅ]&'0[+ś{Nr]檙%壇8ܱ`s#_3rJAs6ˍjQ 4ioWw}2~oy-o_}7uGt yl? k [:=?2Kw'e@>SzYgF*LlF4vYvٙO_zGtDߴky)+9 W~ܯlޯ р!/i}t'6m4qsC!e̖Tqyn4 ><`:rF5^ZgЎ}0F&5 D+ed/R'6%>{4g[3z~(nlЮ#A=ESmm^̷x^Ðw N]M ''MWv:R`q I>W~q6݆O?|h͞wFvQ4G&1^@Kz5zpf`jsAv'_O|P" Ä#/J'#ڬGmoy2 `^UW =΄mW?:1BQU՛_TB8=5hJ8՗ab}a1@P+FG_V/# Vw)3/mk]}f[b۽UdCsRߌ_N5 CЎ:UDuפmyO.z iwnittXҍRxl[p U|&0c*6̤V,veK#>\% 9h3Q#5#nZ]\Brhu*xS[88~sox-,SW}nLmzc+5N`KNYKfݦB-ֆdpPf.K}3omjB4]3hR!9#k`J1ƭRt_e.X™hr}B|#9hj" _I+龄x=B|5]-oۘoNT!bC0Y$ь9VfMhmW͑%y|ֿ#jv`ذig}#ޠPAYCV;$/ A"I0D IAY}(a[( LHK 7aMiX0Q,WQ&&'&}\ń 9@,fa)#c`L'̙;0eꝟㅗ0T7k.ro6btԬۙL6Km[:lWB.R+ B(#j81TSs r3r9NJ񧣊9Ip4Ǚ `PUJSeG.c`+JqNIE ƆREa[K3Ap\g_z)yP MZ^j.易 <ޥ\󣻛f.ꆲR5rLBPoM6yLXQL10PIb3ƩCO6t[s=Z0B[@F,hYL:h 0|;}{2QYjAQ35;F9t& 22f0ctJK"ְ|Vl8-?Ym.?/a?"$KШDf ML"a<JT:#X$ <r$̹= x2$8J'QP*xV1I3=Ď+Ld X z뙧6FҐ`|-ʄ,m:QDpTbb4qgnFMGEhiMbI&ETya1@5|``ytV#cFσӼ<&#mOfiR6QGol2)`A2)tcVI-JA;eT=f` ǃR-wa>G'(+Vo/\!xկ+O!sr $r(4 6y`+QbRnxs ݯE(]ӵ6E씶t5+UKMf;G.uoW9^ؐtVH'on'CngY5ztW&¯){ec\ZRS dhIA(6L3#S3\ 3/3PWK#' UTr@ۙ0dhKGS%Me!+Q..*|>]}y\!kmH2r?_EɮqeaTHʊr~IDciNT>U]5S?'OgG&+㽫 j𲅌+}x2ϣP곊wU}y1狷$2,w)!E@.2KəeNT(@3]'DgΔefA W*F pe=>_̺B`Re BbVL^-\$cQ ڪv%]zR:E`w EsWV۬_tIGt jlJ)F((MiS0gMN,A{!^S× ;Yps@8O441IUcIz'uNaHwjU^ 1x9ƣrVcXyj5Uj|b16&qF:kSM6Q9iஜP;ixFiꠧz:|!-Z:ea|FzDΎR@`ry*L 0(zr<8^?3rxw* @E戒P7t>bR*.R(Fa\Kn^4XusC Tڼ8x=k"=awn21 DJof ΋rb-iF48@+xj";\ cO55mS9(&<>l=!Ĕd0de62>PCt@C%;XJmu6(,q8 z?.4 #~%op N{K3<?F&>ޔ̤ٟWwC.+u)ZG>$_{+dMhj q9;:1qdgapu4Ukf4+GaȜDf,/-MW4 x\' OpZ\G>a/Hx;5GFj$z͏*5Z;!GqV=Zٙ|hcRp||Mjk3[fK~u?Ϯ/|}1=&9G#͵0g0'zloHՆ?\ ;pk k =5u# 1pTc`2@OkN1ث`{]rC\0yorŞ|.y7,VZ&1dR]N3M!#]!ѷT< y^Z99)!Y2y )QI1[]qC,ŇM͏IE@ f[H}.ͽ!OX-J$[,չ QzɱcGĎ?/ ,G qD(qURPdđR!d>Ht1̨5ی҃Uvɶne=.vϵMr+å/L_}t1H5 FeHSԮ̘4#$3Go (H] 4ٻpt9l\W[!jzwtarT4*6[>BpZ׍Pm`[7&[d1aZP 1-2h U{!IXYr媓QyVy^!\Ad ** %B ڎu}Vbx#呼$!R2M c(Kl:g dK=NsȜ:׺qjz5ILCٿG匄1CwaI;\TπK7?l l(| 387s|UWJ~OɇLCѧ~QQ} Og1>J\͞r|I;~?ilqR*{좖L3I^h-qd|՛{՜5+ Mx4}t ãkYm6v6 rXZv=i(0GzqQڛO5.:X ;Ӌp27Aݏf\ʱ=j,Gػ]Geɜ՟zndϐO8` 74!:] z < UAT,ӿAGunsE<-[Փ`$/a__hϷcR=KF^ѧio49Ҧ82^iș4 €Nk=..0gǯ|^B4mLo*Cˮ/ >Yc񸍓E Mf2qGeJLC򀀬< [ t6:G})e%3 1oYAAB@}}阬>Qׯѿ9ȗO#CTU6NRR jťc1%ЫVXX(`J`sINzA*ɐ1 FQh ƈe#n9xSTZ,vpڿ]˪Mk(2gi(v{΁|>/OӇoEjv\q rP,*\ZzNo= mLkmL3mL ,A^:ɌZȺal^!2B5Ŀ^\Gspp{X@j-bb=Aj`xծHwGʢ"cTw%9! hNh9 2r%&]]+ǒ3Vw^Hti xW=l>b4τ-& u5x{ǚǿzx%/QKTeƟn?^?zJlQ$l`D%B9WMtτֹ@ćCdPR-j:'tDz6)iqBNr!_ 9o/Xmoӑ4Ψԥ`0%1yu<@w!@Za.bZH1/Hڃ4 k1"kys2%ݑ! 
3 r'ΰQao97DQy$#ST^v8?8:"Ž`Fq6׻cQܣza]qrASzBSt[i.Vb܂]LlC)8J|(Au\Zd),pg|"AijΒm[鄋m+s"w s T$F/֥m$J ^bFiZ޶j9[OCںOvB8ɫa7۬,ƃ[[ʹt}Kxwqj|ˬwn(z o_zܔ<鵭izIͮCm닮wSj`t!lr; Yxً{)dt']׽|§de\?fW?m<̓~\R'W禍3n:TdA@<:łϑ)n+f9PmQu!ESJ,7p)%GOר9>;ԬM{n }H2glrjî=2'qDࣃ{YS֝}) .2f P`ZdVN5a@χkn=}ψ̨EP۲&ժ'fZJ $h9JUCtEdR %#sw\) -Ω$t E-*iB($+F%@z[xqGLt(DrҎDbv}WK6d8ي <@b'PC ( JUVEKV..Ho<ҟ48[R-]0e>9w_ aN+ih?qozijVNlpTy >6R a2ŀMF󷯫ftɿW\fWS!KM~y><dFU'N#2PS\ .3 OhhLilzm*y!soX3<\477Q]|?WGQ7ʛB NI?lVwQ{4"Pц {(5Q,=How?'W~j.|3~f=}c~_LV]pPz3n\s/s;|EM ʚ@?^ժjj_Y`yKrQԃJ>_:zt_g7^pJCuճ:e}jYHX`_zĩ8LB>jέ)ИIa /h8_޾?XO;!ݻ7}OӪqzeU&s<ұctRzCoUXn t6׍O'i'~WQE/f>됿uo VT߬j`j`z]޿=ru_/^f q!Vqm%U86Z' ~m;ݪo DI] ׉;4Z/]EU+|ݤ.$# 8$TRpRPОӒHݮ#5=$qFκA4m|EA'hYk2(1 {'rzw"Zq䭨ze [%H}9\#E}ֹCWߦVrԷBm#ϗi~LLq% P1Y.HDVM`qY害˝e=VsQU)hmKc )`/^‡!K1&+*{|H{nfZpB:$|}ţlϨbP>aqtm{8b v0L(tffؑ=9S$qe'iA gJ xC)VRP(4>m񹺄+]#_vHkOHK 'MHkp_qFo; dח_d@6W" XA$sLQ;Չv*NZ t{ƣk_rJ@ $r śdmVev< q]g3wjX]Gf7)}LmQ*z6NR\v Ht5R1JʘEUF,,e0%$Ru^Jj2$%l\rB%1"` í5gQ4fq^1WXw~ҎnQۿ]@׿eMS.^޵Yo랸U c(zjD#._h"˾h&7L:Y Q2^H-BV#p:p,p.fK`5JZc붎tHDf"E/,tH0< ƭ|F429&cD}qt4nJS`}}\,Ãϧ)?` gTI,вb &z/ri;D^  x"8Zq.`2>BH(C"(Hy!"[P'c%9_Ӟ?i_?D|c,JZEZn3!-\C[|w{щBD -9Q(𒕱Qʘ}}Ϸd@Zţ$'vئjcYiuIhpT&;gon4}h ?M Mf;,N|vo^ʾ1R5e5*:UZCkR*95}yML,_ s׼n "F./O[7ee +PHY]t-:x5(>]/^~L&tX5rXJG ,@ Ns|uʷ~uYԬeK&K!֗2XCqr҃p2>$X (,k)aI"puۂ.'6gpI9*0@1K.%E/L `4gX6@zwyg+Rt)Eݢ?9ߘLomŎ[t%Y/uz?wU؍9>+/w=9Q[9{[ Vi˹u]^?,R9p|nqknz}>G4 [hmnC wds K]Țɇ.3a@]Rezk-iÐ{iO]dO;5w&P8TS.匣HuAIGK8 c}dmQ̡[ `ci,ܢz~y>D,RJ1@:feCdLǼVCDՁ|oD Bi< / I"!yVd(EI1|Ĩ#OXMYHaH@B̻[y^O߷ |=[b~jv |3uRф(-YE4Χ䙛doyFMu "k^F e63=гzFyu.;ɁI\2$_^P@)(fk' IyZ\ pZ{p{a8x')(g/& l2#d.HIxvZ>* eK-VqdhP"E-YB6K-@*]+qx}.ƀA#HhkJjP  S$ S\k|C"p[yo?给on^翛{je@yۑhK m=p TDhp"}+h+O-<ӗq_o:&sJN-]!kmy^_N Kb/ z;^S˓p[*:XbO_/.ˎ3+ZCĊ)T 1I !a e A)DŽG,>LsLG={j1'ylT/ s1h6FxB(0m]#d$4"e^( GUV2*X\!6#p4A5;~޲'Q xe:[IސtY9dU1BYF *[n9j["w՗_t(l|-շQ".G%3AN@&|d9'505Ũz,2/ U;v[օ5iI7Yzey\'d0:WaŻY CEBE}Q';N@^>xo_m'}X=|{90./W{ro ?wOp.X}=nwPi{?ǟ\ k1/m> D`J )H`*_BNU4)P.#ɁOϠ wЎ& 0 ҄{o51qrU9_{q㙋|m?U8K&w}1n>I\)3?4i+*>MJs3ge%3Bd[B Ľq݇d0'||}ȌW|! 7W-X|f4Tedž9<_fOẈYw~=xrŋZ=:og..Φ3}_p_~>E_]DGaջ)ƣLl/+_oWӞK)4ϐ1iwKMde:;B;R{"el6ѺM;oe3 fk$ɤ*EȆI$b\JJeUdcW@+()ķxB YyŭABY ]^oHAĔQEDz#2 KK`ZDua>EW+s(M)X+\lc5 2Cph6OvLBE$Z {*珯P[MQyh e7!_& dvyRou!4AmSd4ΖJiϋKEI`ɑж@99ZdQ^J0HFflFv\6bp)Q}˫ʌWD4n"7;wMNtttdzʥXٰd:s*k}M-b+ 0Tc'*PPaS%[ض 0qB2%"u8;L-s(lv #]m&QI>Y $?C5m gV$F$]$EcpZ"[ %U5)3{ LhfUlTg[;7g3F"j""4FDqDu3{, @lL&Eyv&A[S:\}!EF%+[P*΄CQ#b2% h"6FflF/g~Ձqq(1fdW\ƸF\qqmI]ݱc PT<཭ZZLz(#.fPp"c{ 5fu5r|)??淼p_Y@9P]t0U\C]uzUJqֶ[zf0գVF0\=N\'^%- 2p8)WpeGKp L \Uq $7tRj5+P@R\Uᰫ*sW0tRpJ)AXK:*{(pUݷtRZ3+TD\! p0pUEy(pUJFzptvs̝ŴNK |\s@$M߭s5[p>t蒖l-;wGyo$Vjgf,20?x'8>?Xh0{b  @K$@NɖH& L\(rU!B~wܿsU,s29_ V%}Pcۍ;ÐeO'A_$Èi䡨'!:/n=vZvُBqE#J͓z: Iu%}<9zi^zKJs)(%2(L0ɣkCJX eAqLs:+gY}W솮]ٽrhI,lC}bO, uu7k{γNm6e\;}ܲ㖋]7|xss `o\?.&5-sCQy:'{k-M-pܦ9.\{m_oL+JQu3͆nŷ?C\4,Cqf3R24n4>ѸnN;[o|Fp|w[,2PtaL;1Y|%Jm"PFJ#XQJmEQ&6dAHRVK?{WHP*A2X@czg1s,1i{J%%5L%NI);WŤ2#>~#3ύ R RdQٛVm9iRO+T~'_UI# Ձwlmoڽ̜ORWBKcH_ )  I MMT$ɻl# [Ey[]}o}:*n9ϩ jl1/1ž;WbB*|M;|jd(A@fv=SŃM&ـdFAS pjsb眭+3/d)Mb{T:?s5#I+|J:qD<0-qdzɀ߂6jQ!VI8-xJ]mU:7y_ hx<_o8bWk_% t`{!EM kFh2v mDRf`eĮÈabIG_P=0gi_Y 9d251{Ijq$Y8W-0F*=Omaǩ(*#q@gi ?ɮ&(W|դ/個.>K+b_t4<<U*T-}ApC E?J6Պ~?B9.5c \V$jK-,26"A9zo@ B$`x>`b`BA$ :R Q|Y3uoqɉO`{F#2'&.hV3(=<[4p˟* dpJ;t-Y?6($#h,TAB])8? 
#ǫzR#΁D1Q+f<; ̹=|t%׳/Kr!x%-V7h9xXOFcqʻl"!؜cuυV6 ,?4NuilX\Vq8W:d3R*l@F-azgɧn*k}O|XWPsOHoFtSF_Yе#k-_ns;66-u0-(<"OiŲZ+dҲo6񓾨Yo@`& U oM WKd%))HSFkdxē6BL9sYF,m=֢9\׺/qofu;ghDX|oߗ\eBmTW9C@nR ]%|I4NeӀtel,;G.crRxvRk-[1lLVdfm$,;0'vHOUAD{"˔Jq!rAZ"Jrc3p!k s[RЏ00kGBU~ZTΛRR3㯯+Fb`@1_b $˼:%m1߂Zݛy ;JUDQ;\5jC(xJTiTm}gPUNZwQe)"1VF%:F'5Q"dt40Ed6zS, ^FA"ˀʐh)2+Aȁ .Kt`QdrL=&fx'6YyD3iw#i!h5ekETіJ ̶=>A(`qO}<[]{0^]+Fqy Ej{ـ'3.VfWbe't<-m=9 M"KS*'KdX$H&:5iq%w1$JBߐuz 9Q6&+YId&S+`AF *X4҈HwmWs9s3`FAݢ_jsXS"RN}fydD+Yrĥw̜]S>tSU((Qyd4cJebU) ғ.:/ƂQ׈dne+xjahYz1{mдUV[s\j&iˣL#/dNI+E|= _C[¢Pjj\wQL {K+.\T p>R0\=QWDl0K>$?X}R5驏O/tg/,kOZ|$wxѻ52Y׿p 2PK&N324t'xrGY<[m5˞Y%}d,AȌ͛ŵV2UxV\[r8Ү=/u$QȾ\v9 ‹g'W#45GZ;!FӃշV EKJ];?#V#5j5̻>-=.3v'_ώ 6cY shxt<[c{i6|uǫKb%%R[:jFt7c]g3- 1Q:5Oƣ@OmO:lU[]vծZizHY^HmXA}29S2+LG>cR e Oo{~}7x}woyu׈:r\Dprg֭:øJn_ӷie/O@ݎXSa ՐiCNA\sXv䯯~zF,}5MKK[4-d^6>_k\ORB,0݀@׺wu7SJ'/HW*j& Ul_Zrq F_܁v?c WfO9\?'o &uA'n6]d\]I[|+)QAʘ iR!.WGUZg|~'HZnd-^W+U aIhqD̜{ /s`7zلzT'zӉ/]k^>y+MFo!~`(|29I`% Opn}ͷI{ڰ<̳N좕L3 j+#jkgm"sz >hr_gx} l.zPw6հ|$xN[G~ن/O~jSW,XN`fv~6Ν 40x||P4*5+Az~06.VM> ">TT=؝k'5n~{80Պgj˲!YReM(^8)!)%ڧ4 `"%DO@E*>&O2rΓJꚭg#mTIdtSBʅh;^ބicCg)qZVɈ3C%I)u +Yj*jR}tt\`w:;5\ Y٩ky(6 [FUlsoY_M&ҵgzdk厷oϴffٺf~b[0ʑBV0S 95>2$@zO!YRH%̆`R;Z[ˑ2I'.F'9KQ5Xu *8O.r@> JޅDT jړN gڛ8[r/r|?׶/=8I|y;=C؟O> `9 fW?^M>K;5tA[ d&}?lC}u5Xm7SxwoCdbcJ[)Ka~+'DM)Rk0^*RǔO,Y8ЇBMrR@c9{v#^ũZ%LYG;җn!9Pz[ bMwOQoW9KKQ-{yeR uou:WjDƏ0.p4>]n!B| +n g;DID;fO'&wLu0Z sHN*XE.@2(%cf6Ģ1BET,wq]yz^r7a>C3x?9w+7,xk)+gM6=LЇ3 ݷj'sߪ+SoU'y[[Y+,[l8^k}0&QZrVE8wEW_~S.˼V=حhj!;\3xQ5&''t滏::,Bp8z& ;_ m& BG3o_>S%U7Si(2\S!&]fY2Ycʠɸ)eh;8JSw1|IQf[]fvh0б4gC.xD^[TǖER/4i"NGe'5UjzkVQg-ZR8ivsV+y᮳ WL=K9|R>-4眂^-6z)%sme'd5$b^ɉS1,]P Vu 6Wg]h(-9˙s l0K2p(sn$g %q_w; BwJ9Iqrrr>ҥR)MQsf(2DJ ) 99Xq6 ժk>-t6V0fLcBsh30fQQ; QqY=&]?A[4CɑQʩ<(tA0VE]mb L&LxC(*xߦ˷`!fL4JIJDW'g" fsa-8 -#6;p˨W*stG}|#QIS in]xI6rQgkYovUEnXH>\R~eyl"eE˧PFk"6}%}>mցZ*aS>_l5(cX21$ ә_I.r^c1SB3ב"T]RqdyxD@+'#x>%[S]rg` c4ւZ` >*vZna>Mj`DSIAW90hɠ/-c!ڰ¤1t [Uګ7 CKHrW4:%a-'PǢd2C]4<ϊH8 4!#NoP7}4SX8+#k9d34~(S3d+KU9&ri `vtՀl!!~(3@Ơ) 3* bB^ aeK8h2yL$&:Sd)iY @F$]@8JVr ޔlMp%BA1SX^GvRP6ZO)AAO)!Uˮt@(6NץV߽.9&끜$,/d K6#4 J>5"t $Z0aԫbs6V@ZIs@ NwK}l%Ȣ x  BL)ڽAX(mt{0Z#AfU|MvbIe6APϘDjDɎ3S͇(hE`&շuNW[QXÎnyz5|< V(GMƪڜ׃bF0hX%Y֨48>p@Zi/e#sl'L]bpALڋh ` eDAo 2Zdpb,p0",0wϢba DYl9i M-tОEwi`J fB޼jPR\ ^֬Zc-7(-Ia. fM:,2 cmaHu`&n6b1;y^ڭ&0:MG]u&Xeh2.˜k3 \DP]\tBÑ$0z`2o6:7  MAہ1bԺ[Cm58EM񟞽$bʅs <@b69vS/yMaeNV(h(cx)U,JG52۹Kz55uClep?ܠ "eq819EKkLfV 58֠P/Rh;=%&}*` WMIB1Xg}jy^ϊH*R3QS Yk0m@fcX:dС /jC ~4S"qc:|#Kx>(zcvNA}6DUKo&".ń124+&8"0"%q9D L,\c.QmuEC@Tv"#DoaLn 0@5a[F迎v/nLWjZ ; ]5u*؅dCg pB& +v@N/~)%@.{d=hDT$ *<;aYJߛJy{jmDm ܎ ŬSK I )Y ig.~f?0%#%J H DJ?J T-z>:N%8Gz!%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@zJ )àJ h(d~4J f^ +%%KT)Ž&%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R*09LJ "x@.ZsJґ%*!=#%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R*Q dx@0Wϙ@VWJH @aaFJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%޹*ڊONeus|C/ܩcGC-@;t hU3MjֺYJHgӕp3uDѤfǒ+IWI!+,8tXUVCOWtKLWio+MWm]Oѝf.?㷛BzꗖKLWk瓟۰VcGM'j_::o<={$OW&?1p!vE85B]Fh{32-~w-NcyX؇~%\Xʖn2O'Q}G~.Opz@3)GA EC%%'xMEP1*}jG_G_ \я-}a6\.;ay[ţ馱>?֋SLw/N%̯}:Fۓ? 
M綟.&@dT AP1=*hd.VKz;o?{/jn_r~j~>s^RdvtyݫumzX>|9_]9*Lߴx׸:ݤn?崛/y>|$M]?;]Ogf6r+EMyg!zgқj<]_]ם ]+힑Q^njb(S[V$V}7M:F|[[^Eg}~^nWBsB];:ZYߌ{;w'?2ΗzUwWjk- :Yor-x>Qm]^wJ5WJK^(~j$]7ߜޔ&7gc-޿)k\x|(7 W򫓓F'yx7jb ׶:]?[^qt WrF]fHp@Ϧ%NWhsͦRZɸ Cף4uC&7jd~kۛAfl]fiMO.6ktFn5m:Iou moM_ Ia7:\ͣ;]^~Ucf pe7ݛֽZwhyp5lio6q4|~2;7@9ɸ-7Cv}F,r/.-],^{ew/e4^ƽ¼|yUw*]n͂A 3ƬRne`RjEnTsն vLqvK db;$Km|`:̭nJvJUuʷ}7;āU`g}\vHa)Ʒ8*#bitȵ5xN 3*plg;0i,\VB`x'D2ml=!>qGY|}O+|qً7h(0e.`pz%alDa4{҂!ϐ4hxo3L޽⠬&ygC`2Fu^Yu^[1YqpzO[qPAs!skIZUKfYǔ|fp-㜋H8!etֿx0cyh8.`mL)`R"0߂آ|q"\3Nd: (f= OEW'wE⩨ Bj&e. Mv]f`~rD__{LeWջA{DTAbcMÚdNY,UJLjIB{V2֜kI*@) fh6 쀵-p#Z$5tf̜3pXe\8g 9£ʅwQ_4daN?Lts.hzgloy.'QTd/Q(ZQN()BQ"`<5lƖNVnhd/ddB6/Vp;|NʛJeɺ3`1W TR]lz`<{R3?͈b((#RF!x3&J@ޢzڙ(ښ,ܰjX܅1ciEq%-SQíL`BY,BPĤΈC7CW]`^o^TEIy6Y+m#IүR>n{ml{hw#yJ\SòOdI(*J]EddJ; lլ!+6_HQڠrJhC'r4QZv\:NuƽpȌ̤<&^HjG,Ք&ӴR>D3O1"x/m/43eG)5yx3D9 3jHNhG\f+5o z鹹/rqq*| ^^ _?`|d{ǡ2gӳ|4( ,zX-a`8d9*!pr7>Oᎃ,{Q8b  @[F(u?Jd$$qP_KI:$f;sګ롪E'.5u x扭Ʒ߱1m>JN)˒Wo~k#l9V9,hD.E`wdCHX靐+}EpG߿&?ҟ_iyTSg8E?4$-vV_\?Ao1xg;f0XQqk};uh6BsQSVU;f[oDk}η1h7Sފb!RLTML諑=;Oԗo/ ͋5FR,P4S|g$7H*NGӖJf&X'i ^;hr\I9uxo9$%ZJ@G")9IE$s2JaQiƝVc NUAP4@2qU˅Q8 QZB-?'I1Bah IT:f$ӍfA0 fw_ \'55m@W3ԃ%LaGA&2J4xO-S1h,u^HwTmÅ (|0(^1q 9 ™JLK0Ɇ0(0 `I?*f2rb0Y2͐K}=E?#SdO ٣gK,`@ N!\S 0ZS  ɊN[lBW!jv*/W0 =wRa0)A`q<_l.Hp]T#k>pr~T&m zRf%g鸙R.a8[xuV6OaFZ^y]\x1Ug+T^3sn>;=+ǖ뱝  N!<39C'FPcO鼩܍ҍ {PDF G>㇃U=>{ɚ6J^lYzHi!:: >?G.~,EΆ ;*$3׻H9'^U?ū'?-?yGWh?/nFTk]SWbIGn~%s_b; Zvn#-텢_dSZl N3D1Wk0|`7iaϑYYT`{4H)ʝ/2^Lt~:+Ғx {zsΜ93`s~&gI;i(xlRGSJmΌ&zG#wG{)6;oY)HҚSxR]]>R*dȲŻa +.Gm^}n~| / 5nj{g{ [~Y,~&_sM ɶs 1H!IvR!a?(%EǸOq10 f1grH8+P7n ֓\>rO!uĤDh:<7 u.n{#k\~1~aT8Q6ask98DN填Dj }Af[]s `v%r͙P6R @Iq! X{+ˆ^F%,+ʂ2""&a,0<)c [cgK6+\ܺ&UI's:~Oh!IK1#ZFA>+##pۂ 9fܫ}xwE[sbq;{6Y 0EN93ȂeI"|`C-]IlwQR,޼7|8/'c 0a%8,W: F#|'8t0hGKw#B{"!9bZr<Ȕq^hˌwxY'] 5 TI= lM!Er@ ĢM ,PK-;%ZIU |/zy|~zzQ9^_JQߠ+|%e酣rӕy9mt!S NL 4E)䲞膄CB:$bBE*lў8C"fҠ2dSBVt%T#R8Xaq H21P)eI&:Jvv&S >*Spryp̧\/,4:h ~Ib_}%Ju竧'ӊszCҦ(9cx '^\0NVxl2OhYB$dٱsW~V`IDgےR-9c4A?f,4;>]AɌi*drS")[PCn4:*guOKy^m?Eb(fo s@\t1 }F10AAu7QuEV }^8vMI?}><;}?ޟDh~i~@oq?OO#itZظe~{ź>otmOcOz]a< A0hày4A?u0hàyk aJ0 A0hày4,, AnHa< A0h4v4(_ Zj9 Dv42 A0hày4aм{ڕ !=[/ý;]g[KFW\ )QOIc(CV[ԋmep& {d" 3ǀXle 0"aPU~.N0P֑s-kp!}Mfv3}9QD!sL:bRgۦVK6#y$ŷϊ`M^Nk0xxo-{}:Ʃ`-<ɇq"ʂJYg6\d 1cG^ΏOt| l]X,ݷߟʍχ|r瀞i|S99vmt*9O?iys;_|>9zWXXw=qs?GqěwHxxq$2ɅeףS7jeE=@V[DjP:(+Q*Y9 Ek®[(Vx !W_򇫿TC> +CvX'A]zIeZŨܕKCv'j/]L6+>hÊUGш(֥kYΦfUP/1@H /%lŠB @d@') LbMtE|By[8[^^a.Ǯ] ES:OW3IO\"}-AJSJV'V{aGiVE]֬f_pg >2(#8t(BDsZv_Rߥ@0m+B1aJ;f6j(R*g2y.#/OQK6)` WM18&sD dW]@ :t#z[ _CUƤX赱 \! 
)&˕Qjg$A ݈3{qPnT/N::y)(VT*(0`I$rd(iCs Ţj3fG/gOm x>qk灺o37 ~O3m⁢#&SzISj8kp܊,ȶ~3"ۗ2Ͱg|ݺc/tNHmң G+/YYlFrYfA&O!$J N8qMƮv";zfuz_fm`xLLL&aҵ%'s" N 䕊M6k#,=Ns{rsNK Cb<> Cicȳٝ!O ?)?3Rz f${5wFܮ_aw?.zǙBJڸXLΕx*A+cM%[DCяG=;ރIo}&B;$+IGg6T]M]~=rP &#qf"l$Η8Mϻw Q+YZj= s%l3!؜Lc{-k |<}N?)eeMA>Dq[}̌mBAoEbd Q'E+˝KQDڸN2['Q$I ds^bec΅> ֭FKZ@ao)hWf^U /iHֻ꼈a(q @E!bm8r`$Gx-EI:zIALl:#bp8SWgK]K3.ꁋ7xmh3vB*8%ML1Vzfu2Dt(Lkjw]C8<|hoxSܨ~| 쬿[ɊwGѣ14 n2N)-(TwZh›ڽ̟訜Z/ y6%m F % rB1ցm[w \}gSm+u8̏ôF/̤?],m-8|Aχ]kk~5t_N~Gh˥>䜙M,H)Yr.3๒@("< "k>0d*X EI~N1&iokY |0[61HXu}'l|*CFEdP^lDɻ>5% D--v h@~+ O[~lns>vlR'k崬Gq!y*]b5EsnRaX9ϖt-U2:;k<<}Sгz9-]ms7+SٺU$Y _Ε~!EJDR#qWbKph8^~Zzff)^RS |)RrNz $LQ^ǠR'Ԍ۳)?y|~19]eЛQpҝBw&wdSȦz}q./96uM(ٚ&*6fѪho lاs79a<> G-r-_:,ayl3 wr4xl;:f#CM;&BF{޹\Z: n낏 >6;R=G%/2q[uTL<"Cf&~]aeY˂A4=:ztꞛ ={R S2"d֖5UL#sh)48'$6N#Nk $`Yz)wIf=I F' !`#!1a!LGnu 8'FH%0U27&[6:|GoԻ9A 4~:&զTu:X5pn:I͖X(wIt!}=7ݾ[/֊bA襍Qgx,]MM@7(V Np))\Kd)=4,BP2^x i18kCɾg+hᢟ֞'ϭ3V6AwT>zsG*&(tڗ>}x~84;c+4{rƗSY d靡,2=O3cDY3#jfȴE3j3)Tr;Zal,j 1jgV e|4W%B6.30Ex6f/ySL,rjkJ_\¿ݕe+/j^Dfw?0 A{N:l,E# ,Kf[ H?ڃhL&ڃXV&16Wn>[k[9Qo׸+{k];2u*3j-GؙGSe-/X^0>[GOOEUx65ȋg?'oeG?4n0y3޿zyW'ߟah#%|JA ]\ @dXKSY ./Tfɳéڂ.oO"Fxv .[MN$x?Nw˷|x6_] Pw+n]OӏobU Y=-h2%CxŔ/PD5* AX|_JE?0JIոhEm]C_r e,z^ oh鉦%Dy3N8R^ޗi;U&Ly AzjZfp(`)ꅥBg gU"xƽO`eřj:]05u)QHP@#IĤ`x<%{\sl._[9"]7ՆyX+=Uyc<(bX.ar &')S9xs-#%4O cT `ƙ vT] 9* T 5B2W10Z6L6T82X)0#8P2ܦqz<QjmLԆɺHAN{E l_IrcE2 FL)?O>B/O4u\1QB=q1Ske&lPqDp"m.0)>Ajl+?sB Jh/8u0ށ5!g9NcJ9'`\1gI!`&E7].'+7MfLγj@_:aPN98`QdNH mY?kINd`=u]"Vy&IpgA8T9S{燱)sq*"{ zń-/Nϖs@Ye78k*g■ixVy0{'xMLLwRŴ87Ճ'[hOQO}U.8@F?^"_ѼsiB&ʚUհLinfQXDi48j#ĵm>l+8- Ll8BIWhN6GL<%Փqc(#GcM:OZ0)xkP1 B6 /t`x5&FH_AVuNUil=ŋ{ݭ Sgf8OscV*C7ĮQpkG;Ҏ$' gQ֎A ԃ$q.OѲ&B9@fRr})&)s&BC4 N^6U&ں1-kZЮE߯ZA4e,o'j~Ɗ}*c\b$E;Z1?ŠS9(oC'W,߽Y5ӱm~48ZAޡ'.vOCmV~fըpxpӹsk._ ݣm-,K Ŷ״vi)e(4M oL˪Gtdi"!my߫. \@l饷ef ZWu+\4 K`%IkR2D'JD6`NL<a U[k4ϱ׵ѵnJWfvz^x7ȍOӿ0 \zEnWvƩRFH۞2ڕ4˶gABi0#1L a/&n|} nR|5 G}fkl?dd9#JZ=%XZ-E`9,NnX20%3&6펉hi(d7;mbC0-z,w@rt)`..jcjt1ZKw@WMt֥YٞzʩCt++MW vBtt@UCtvpMg+@o;]!JFzC\Nw]+DkE PK%vDWV+)b]+㻪vFB\jJ E]`Fc "\κBWʶTttUVu!q#\BWVՎ(mG2sӥ@, ]!\>Q~gpJe]`˺CWWmQWDWt͡D]oKk ر}3xP tS,]!'hV]+Dky*Nj)i]1㝡+븫jvB}+Ct%+MW Ѫ4=+X+93tpc "ZkNWJҕԊ!B;tp) ]!|8]!JzCR(% kcJ[j[OWRG҆F;DWXvgg>r B 4} >ҕ1|Xi]^vŦ-` '|z}]|DmN.fj?v.XuGobt8Bj~8rԆKK&6;&6•fCڴF {X?&6f CtS++iW r D)𐯇ؚCG6ͮlxu#(f(E[tzv5v]+Dd Q'Ti!¥+thy+D)DOW{HWHʻ]`Fc "]h \훡\0{%VU]!]ݍ Jx+ECtw~sPBet$t)^.!Ѯw#J OC9B;+KIg|WvD)+q`xnQa<>q#C꫻`wyXLc}I:XuiYr$N }.ܺQ-"t:dxXUߩsƋÇƙ>ػA~ݯEp?m7d~;PKmX Wvf235gXDi* zvI@p~ rjN|Y=DPʖMftz:)(ӂuWn2\BW1R % tu>tL;DW2 ]e^_$mRAOWgHWiтz ]!\IYW*tQ*@h`.%+tѲ֛RΐD{]!`F3Ԟr ]eCa=]!]I8! XuG]etv( J KCqp ]!ZNM*佺:G҂DK ӓwf!CF[?Ԏ(Α4(F ν׮O={|W>GbF#pNJf2Wr?S:򅒾:[}Z5XIN~p1a.-ɀMwFzӓ^d3ZADIWXl~`K"P~ n3rjx5[=pced-AWcJ;DW条+ v2ZmtutR7~K  P W@W*լt(z:Cb0:DW Ww "ZAZo3J=]!]q(1 XU+;c3ZNWJ(a(]e;tJ]eYҕTy]d*63tp UFL*3+0]ZȀ WɮUf0d3+9Q]B߉Y.]2Jћs+aCt,d@t f(yЕ8%1zX 0}bU=DCZFW]zj%Ct|U=BWh;]ez:C#5U XUkHW ѾUFٶ3Z{zpB;DWBN9hn;]etut%xƮ@;CW.hi;]e7HWh4vP{+MW mP{Fٶ==] ])J+Omzpygf3ZNWe?3xtKWܨe3Ma|jY`Y6͖oyWPe'HOKbTNhj {i@֌ckղȶ[~{U.~TV7tx^V/U?u22I[ pr5,Xb4icsYH:X-I"}l@oPUqsv%65 o%Q`gqp y,ÿ?}fnf.B?C6/yO9Z߿ -zOVv2U6~<)*f|Gư-IᶛOOؙ qVfyf5_ŗ( fRZ7+>^<f7ݜ?y[p7fV3†[fb9ŋ^ϪmTfMdp[})'qXcXMo,y=J=.hCL$\To$с}2 n>%XY5ǧ`ӵ`LM`Yyyt?n͟obaX \ >?&.f.;G"BTh/\{~^7;<?՟{ ? 7VU5oe6G*-V}г21w_2e6O~l fՃ\RFE9|[ kBxGy&r!z;`j݊m1-Ǐ 4TԧOQ|OJ1GGv8W9c'1\_,:Xo,?_.! 
Jٗ,,ۜSsYfq}bF7ꊂ$6+?Ͼd3 d\8 ꥋl#囎fqM7J+ .kNypӸZ ZDQT` $Ċ>XrɑSٔb/6+AZoQ?#/ǎB$6iSa9NyUSG>cWi RB Ҡ #w!HƶM|]QPީmR$D;MMqrsr) DY &g (r ʻy&*ܢ#ܼASU3IETI/ ]mB>|)fG¡|)\2A=PZۜ2\ loe:ERrz nV扥Ht?YFW,vgs)kg .0u9u1O;{,tO4۳ʚ8pKSXóUx拹r&w籌1}m~ی;4L;''9[LSzu_].7斾%tyQY:E0sφIW-{؃4V`lګέ'TlkzcR6^-&_w 8gB -+[Hfs4GŞev!W{_۞rxrx̓1G!Q$OP\%0(%LR*ERRSPhUDGC^5jR ` Qch-C18)rd&oJ[$8A=PQF4Z3nOI|ֺm t_cJa)$.4TJYfpRaЊM&>x ޅz{h6IX 2q{ T,}ai~9mΣz`}$" [\#B9OQfwy99&srϴkN2'w$6:ǵ-(@1')ILz0k<% L4$9jQ)GS^j8h+W:2B AQ; =ۆ\cpK̡qpM_*ˡ$De; *9W~ESsή|C-xlF,hYL:haZkQ=m7@`Ŏ!1RE \Q!&$4':EFa Ycp2)/G/PCB@0 ɒ5VȌՒ[*`ŤJ,R&CὴQ'n4ΝH Ge4:Keː(m@`@PnXP+ZI-?>I}EF`܄ZL2Q$D( - 36FBdA<:D7&d鉣 ic뎶MߤMmU$6*f]q]HWtQQcZ'cdZQ,ɤY +GQK󁡅yӃG[s#"~<݊=mK5D >pERuV`%#HAȀIcL*%[) [iEt4Bptsm0(%ҭ RJ!ן,$MNDikmt}:ҋrR|6BvJeg?g.Ηb)9Ӟ".DXPq$!Z"ՁHOg1m6T#"m;(Kw.ں{-/8_>hc\ZRd!p <bfFXg6g^gաvI:ޜ[ނ^Jmoa<;h7_> h^"w>:@yIm;,ۘϴ}}xl44BYkJ^NcVXcTFFI&RD!(d/ ?жP;8yIg#_JFђ-yBd{Re\G훊D 6 ]dDw2DNAg((fwAB੏a(`iͩt"8 jcpg_1AU'ϏkdcQ^>=.>SةFYy p2vMQE-VM8axSrV)ɟZ( D0!4j-bT2K6!I5͌όiƸq,Ҟ ;Ņk2޲z~<_~6n/y`:{mLϵܒLEJ(1 YI@Bq1SeZxYf%Ő͞ @s & 9ɃTBoGI‰|L+0c7g?c(T,f[X֦=k vR! ޡAGH>IѥheZs#m> wZP!= &DTOC#r^.$ MuTEǍχ_{5Kj2"4̈3bψZ3q1ѣysdRJdaA]N: %ShLO!D*.!Q "3.89ƨ褁+f`RuÌϼXyqD:k%"oYϋ=/j*GP$*Ic(ZW 1Ѱy>luq| ײ:1 k\xţ5V?2.ÖʧVǒuEZں"DAPbT> t#j2~rx8e"Omp04g$ttLK2͔2΃Ҟi/ 8E)spj+;-8lnwWx>7&8yq<]t{n##S #wH_6~Ind#KIGWldIVKܲe[RSG_?V` JmA@Hy^4Q4B˧&S $5@E+u(OBϵP5@w%t?nֲ4=! 9o  "yFuton7(羮+xN_'`57Jj`Mh<94/6 &;5~5~5BMSL3ovO]|ǧ._,j~\3FQʐVw {&.of׊ۙ IGY92n;ƣ"Ff"P1?ĩFEIX]PdJ4TiI%GA4sBQ R+/B??z6`:6|zyo&_/*?u^XVv0cTrn$R9e׿#I VW \+hh\ y5h>G OΊUpS'hć_śI#T8oÐiza1噏A?7rn~4[Ne@tM2,Kg893xn#f?z-3A;rjtETzݱp㞫e?`^Jf(#wI m'NPsosNP'A$KC=/7DӒrK O|;E֮Ϯ^e*5~|#sj1p@wG)jsrAQ/,:Yi U/\jF GyxDXVnp]rOI1ze`VAPBs*8hG%2+_P싘ah`ZE 6L I*H hJ5JeDe-n$w_y 2,Y)rtOAX y8t4g#b(\1*Ov"G Gez6PՄ WR!1a@5R| ~FZE#cT`w]WIx8Mh Vb -;.Qu^jvX!lBP?q(A-=5(!-%Suk²_OCr,1Ch9)Th|y NpYbHW R!\NI,ZIwPlB;9q;lXpv~^u|x#jәov|[i2:jP 4Z-R^NS , J) Ѹ#FFgȪKaF1;$OA\}+Z͹7f iIhb'YJ_" !8m L<%[Iz1⃱FBMG-o-*&Q9:ՈR7eZs ^qUvLG9n̓"qtZ7;.ácYµ>: }w9!)H3*8-?ڠs@&Rr})&CR+ ;ddy9yQ7zDZ|uk(AױW4@/e5Dכ|0 o, T:b0k@(>bp!VX6rnG?{6?G ߷v/KjF`QÅsk1\Ym--+b!AZnѐV79ʪGuH/D!myޫPD HGHE*[zm̩tEvh%Ik"p I*U-&A2"`r %F:ֺ;^jvz5I/ X zrT]=lnzOc¶pRbwuZqQ:RdI;]Z0t6zxbuBWlOŽӧjK[Ԣ%64ѲDM^ IE⸡9ADž ;2q.pb IJzlT%5@bkϴ҄֜[ }J{=׼euR&\{{?S|>xb8g. AL.9C+=\bpɷZ * /d靡$2vAŞIP1s*!i@XD&8R(Ad#49 s L0σRl2M1mă\0(9b#L q!n`y>gM%b&X*tᦓO(](ByОSN$F<~F]02X`I4b9t؊A?HtAg! 3uDiMLJ7qAS[ ׂt*1x<+Iu?CG;?u'#)S^+4>X.xm9_x-9Db<$9[qfkSeEcvNNtpd~M m^;DGZ)գ>.^% jQFANdP Sp5S<Ϝ1;OpFhL8s3ͽ<{m}T"\>^Yb-)K V(R4e<%ڇDRJ{p*C8#,﬊Rȅ3Jp#3):5cw(TRղ֜g9FJqqWV:УŮ,',AֺeYryeƛWqmxLY),QƖuriWW9TgBv΄'dh)'O-֚G^~s{R'KaL,4o,<6puK} `ns'«X~Ĵ՜|u֡zlP鴍iI"uL@I)Pheˆ*uvG?OgCe8F^Y[iS ! )7J[IDUAȅgшn/< ~2[#_<˯!|@2D8~ .$&L{&+ 1U!Z$ c"DUf)͙DABY:2T3l u6孵ꪓԨs%LrTh:pgIh]Lv&ufLv}"XyOji|t`W8:gj{㸑_mH6;}f>\EҚXѪ%%^#}$kF[v6]T=dJtӁp`\mL1; A d')Thv@xȐ#܉41M_Q'\ U*],iqV3ykbtmST 4FJB!dN%¹苔E[5_ȪXA cdx]vAIYpI`@-J>$SDsб` 1P~& fJ<X.Y; ,rV ѐ !"Qf@C,,i &W3ywL@ƪg]L@Ɩ )M3 Kt;g spυ9xnn掗jV 0;Ӿ;_|ыymU"O>䡍#iV}7Vwp/ٹ 첃>ߞdnyrV+3+X"9F5t CXfx:ԍ*0yHRaX2g=`F] OXb04}l˗Th9!# b"ղu[ƖT$&UjZgݏd8I1T=f1PG5Ɂgo1`r"7-рĠ L!+WcVA(hiZ m6Fq)p/汿@GpgZ^Q5WU0$P!2(J2Ę ŠQhmgHS0MLsqw_Ozv@Z #$H's ly(\ )*(\`_dCw gvTz#|9w.f-0zl8ԳA9&5{v*_Ֆ?i.TAd3(m:^3 EgMbS.kpR*p ӟ;֠c"fzml.+#JL"JF!:.GA $5Z\-:;xē6-30)Dl(T$$M&8*XbWjr*L8FUuL<n>f: D^PhC1KA [ᱝ殞JݏEȾ+:H>QgB/YlR%~Xwawr.cY5q:Zv}#4֎5.հI74VMN#Q Sdծ*x*' }c|~7\ϠZ$ȃf aA^(QOy)97Ymܔg便mL}?e~v4;_ Jg?ޮJyU^n5_?/~y$hIv8#kf i1 d4֎.ͨQQ74!brФK)26'l$ qpR,)Rt2P٢ːs=Ic"1Hc2@H]d `kTj}jpEvBOu=$úD)xd[&"+ϊ-&dȤ Eev:bRf# J x-bkid:FUMIBNP;N&B nĎEB1ǂͤc,-'Ԟ<؝1. 
;cRd" ,b;gT%#(i[wrb Eoa̞4̢Ig`I'&;QXTgQ͆s7~]ߢhbf[Ǿ#qB]XP|&&oך%IQ :/`j2Zpa.BLh!Rq&Hް: )%1V4:7i#7QWn\cZl&%iz wI<$E: Y6PCϬ&taȞuהM8՞(lg`#`O|ݕ{Mя)4;g oZ= <.+:bPKV5ҠG㿨[7~ӯ֧A>жxIw,&4KII{uER m2Ւ_8҈:͞Řq,ID4U4BF hG)E$Zy'heY9r]59VDRpq78&#B̖k. ~~\)e|VNof,Ջ s~.V{s?3<|5TL*1s]tA]0Μ0yS3a<07focӴ[O~|{דm~9ʿu.\ԃ*{51gŎtaY;:eⲃV- dLjk_|KONi|6Eּmy0_7_Ύ䳝OH>#|安?g|{~V 6*2H7G]F6ww/mZBF+}g䌝/:`Ȗ)LJB.տ~IÕ֟uyzr뻫?l n~TpB;ɎlTo[2c0[ݩ :4ums_|8!6ȟEK>l}:<⿔2%Cy[q6+͹ OSw?z;&g葇WCwwhEQwnz{ҵ/Vhp;_5|!O.lUθMz>^)& L>1fʵ g#msJ  #{`f^Rufù'3H(-||~WdszA;!/5fK߫usT_~?߿O~oSf޿o__p&pi=_-Eqri9CwT9bOQ^*3!C'Lp]QWl\cv΂﫟CUS]cӮ{tx>&߿ CB bÀ@c58ciM|[TGAI O.(9gZOf͚'@AJKǹrJ5(kSkܥ(9j܋Ip(omRHR@X@VWH\מXLLXan,bdBWR^NRNM\B‚1D.("+adԼBVm-C oG58rH\ $41͓,%_"%!6L<s%I:n (`,Pßx'rI(3ap(n@^}B-D8',_?iu&dO; ScНc˱EGF: ]㑟JL2N/&ܽ9B m)8(!F Y τ-2:*}ڃ4!̍J6?.{[}JsmsN3sJ_߮<;)t %w; t>_Ŝ77gB,3qK.]Uljo~>7{؂o#rn+|0my^Y?zs̔[ʪmZh˶EZ18EYR}T {7~a"y!mq>PD HE"[8ld t4-zRz􀫙)=$iM\R  q@Fl!"MDH<\OIn<k5'6+6I'(Ru}KE;ay\]lnz U7T qY@eǫK\H .]Xox6:qbx~ BIѠҼɕZDp҆f14aZiSәn+rbC͕ns6-q9=Sx?x+y| &C'3׸F@4IBh\ )G!1z 9 B4?En*k)\Qa Ry i"48RYk+/0 …6T3n&96e2)@P#XNJ b \IGNR O }$ZD_|\M&QFQ kp TH8Kc %N.q߈*Yi!^€:3y%W9[ /ҌX֜hGO N_3Tnxn&WˆwjIpm^"V*P2QR糋kB ) 2B")rTx}9#4[H[*3RBL҃\.L@fLz*6L g7YOG$m2͞=Zj=Z,BEh%, L7.Yn*Kل& Uhlȓ Ý ZjE3!%wPg"Vhf+o$8$P魧PXsBJFbP<5 @,+ 擏 YnxCaa0Uja `2hMALWjE ł"Z^Y {b6hLH$XEYzV,Oq\%mzsM2pfS謴)-UA$dU:rX4CA{R*gBxQLF^:j^ozgS›چ.q3PD;s@>c&f%uJ?LF&x@ih gUVM̾wpzhSP k>,YôB7G Tt/'8`ܩNvLnFq{_`Wo~Twͨ5~ujw^ngCEQ2 {o^>DR\Xr (<hy&sbtl>r;\h  + p_H5~vnrɧQ騝eFv]ާRe}~AKF_?&R1')x%D)D^8ΠP0?E+CkkydJ!ވ`-3E482yj:1+JU̙˞ctG3K3|hTEz©iƜySk#Wy:CZ#Zʼn`R0%!0T̫80Zi$ZtA'UϦm0,69O v/إҁkm%Tiިf;ZG?9Ot`ub? ]Lo&Җdn,rw0R{d57\-z HrPWɊ*VO %ji X  HQCJm2\9˳pWd iYJ)X"FY@*9ue"1i[3CfUn:FQ 1Pc,"R]V'SXvL gO7M2nK!5}[8e)ܔ"o0=^8>oxOu3C %kvǫ[xWn ֡soY?BB_/pǧ6.[Uwtf^uW|!,eQ_myyx3@1p7/~r>>=>g~OU;T_=kE~f hN?mo 憉/ۜ4H~ ~0 y{Q Ͼ%W4Yy%gw)n,-on[߫E-wS:>{^?l&ȘwPiո%SԚ:x 2l@W)C٤t[d:s?ɪ.l e8f~SC '7^1 FjUJH6OaRwzRCa}l gc׮(W:ޤ\"MpFLk}2l{ֲԖ`χOhi.Pຂ,>~e!.L+@biRh8n΂,hÕpI DrAeD -lAe\Y4%9'kmȲEЧ ma 3L%;APXk􊒝d1}mR$eeck|f8NOYrTV^cGy; $ƅ.twZkbB㦴v';6u2v=ԻfP%߭RxSU &K3mDۤIM#b#|!fLcWIx30fEE^3"Z䔄; &<lQN{,C똕tJG҆^{g)Qޤ*CdS #t !FTYjO;"0J=B^ /D"O0Q'*gezݮ\vYRXG^%cNօM,iB>sIQU-wTM%s$+f[t莎|Q椄,'Ш#1Lx'.k#U>)u;E!ʰ*4Q]k?|p.i~ 7(W~9Zv=~)bMDq|RT1yQ!BNh!ìr [&>Xz6\pjAԓ^FޅL}6=Dd-D&xsHu@KFiҪd :ᤑJ%V 0 CMLj,(t_8g@Z4D. iՠ(}ڄLkptqc@^BB&>y _V 5ȤuՅvTl>:d,XN5~T}%y E*ΊfG6Z/kI|"Cp[z9 ^D>"CO,%VX(;KSvpC*Kt]H_>1&!Gu:SAʀv PJNZ%l-ѧFk.IŽn:L>^^3>c &˽e;VAa:(%Z*h]ƒ3iHQ-fWbdU$zõcfl~%dl%VmZbAnB:b8a$De^ۆaMEѳT$$QzhƤ #6lDŒ)^Xh' JTEJԃ ( hTU")RC0ւM[ [ƮXd rV"⤩Mk<)@9%E bZ0p0) p#X("#vvPuJc-0'?Yj0Q380)`cُ ǚ&EHIwvdd, D ᪥hsOH:'/u~5kW!*Z_J(у7[ )W ƨ:8F 7VP8i* HFWP F̀zreR\HO30%7aFF{9>X2'Ipm!X7ϦrUq4D2_%T,<Ҭq .#!:O]Xzꢀ%!Z`4r/Op]g銆ޙGF'>`j?ʯzFof}ً-mtҁNCw] <pA&J[v})EbD ^=(#Q /_ {I]>LDB jP?@(P8D|6J XgK@_(jX J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@@u<'%@ L sWM&~J ڷ@V>V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X%)`|@d6"kܕ@VFDV}J M`%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V V';|@l DF+±KT9J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%Зz~<9~F[M~wi.OfvHH|ac٧(9YY_Fz3'dꋓl9YY{wzJ~} [*b~V~4,fczA~Oꘔ#w3r۟3r^CMr>Փz55<&94i:꓈jf |[x?KZyZ.$=&W}S.}JM Tu|>Aϗ9E:263 #1xLna[O`k~ܙ ֓Z`C %&i {,q, z֭h&XE0|xNo?ZNOWq?cQ?~ϐyN?O3Ze!x$;ax xz85c9(L qYV)xm[e*eُ덄tVJPCjE%Cb|NXhSRZ%V6x6DbHRT;/^IW >|9)ӳ a[ogf? ? 
)M72aQZ)98ny&_vOlΨX:OooNۿru =?<=ۧ_~0yvFMq܇U0T+էEF P]JU΋7֬tgl'6eu2i@Dv}r~_^V- 8~#n=k?̀ 6~:?־_\{o%}-pƺ6{ŝ>⋁xT; ץepuw/<.V[|TT祥Rn-oWBo4 _9m$Pf LtZu U76 m!WIyw_-|DIwg!vAVhn߾pq2(ޛG+ۢC?=o?l[nQwaSԏ~}4~O9,2y ?|ȩGC;'ί_Rr'6o(&?;ss{GbNFSt06&l:ޮ.g׌_S ۾Z\!UZ)GcBd2HdjrHzfc= Ku0I,媇i||1Xk@]/>zص)`"k'ZT|mBљ!\L(+Y#JhJ*ȐӴKnS=d<6en.__:"ݨ· H}Ek^o~= x`\ț/ngmٝhp dw{K@#$(*47=s=Q;6iyyݮɺM" tijuXtx;'\=tpF^8χGslWrXy{-o.~վ#Z޺?\ڻ*6[?ws v_$ vlX<䚷ݿp&qu^{> -}fZK㴢#tT[-onUMXI6+Qy/_e"̍'^N͕¾˗Y>T46wmH_o{~ٛv[fX$mMd+N[ˏ%Yb`2-]>y#vV?*N=dJDC;ҾRtef'R{(EJl ”ẜBn2TIJʫBF锫Jk]FWg3.z0YGHI_dž-* ltM.67Aʤjh@(>w؊>gJ`R*YeVe"˳Ύ1:$S^;RY)Ci: 9DP3KlB݅*Y5q]̌%laKۘ.'FfÓL,*r6t&P^/z ~b癩rp3T:p*)uc{fR91r7@؈FѹZ\-:?O_8hb`Fje1D!"sJ$Cbp>連 pL⑰y_Wщ[@ލ+__̢u[k׵(x4,`+,ux-![24`> F#@u<BMBLIpgT]ĖNfEPDdBX/fv,qRdLu8;<ԮڼC΃]i,35"-q4X~lH!'ܴ*ˮc]ni!ghDޓHLG(B̚*"Bjl_))x01cWDQt!:, HH-8iϑ[˲^z<DLmMau\lX1c)E>%1D ΄)Qí$JĤ,`)^ED:.N'}Zg5-Ue\.vNmԮؑ Al뤣`K虖AyDS0જմPP톇{ǡ?mu, .%Ӳ~q8X 4I8(LMԜy[3]7܋ZOq< cP?C`RgS1ktfΥR:uwe}0N]M[٤t@am/csOH/֟[pLͬ>u#Rb !hoL,Ea-b"*͍tC BC;o "Y 3@Ə *bGDlFnUcT6ey{>Z+&hdwJȠ8 "7pH;텿+'g[::- ;%*yU e<ud[AIn$4 3cQ{ྜoɒheEӳAO+tG p xDx&r~#LDe3!O):ͽOFIG q+3F3@vI)h]Y'-DC߶_/wFb]8ghPrj3~, BVjx1Mk?Ra\r<'& J0rj!Dn2L&#B).i }$G;+"ĭ4͵zfMtM1>Z"}JB}V~0)[^)1mclL]O%GIf~w84&0getEǣ^ɓ%!K ]}Sd6旌n &p򹵃vzRBd}Fѐ7(KƞbD#dO;?z)a9|zw׋6J:n)q\?` lLqɽpw'iyXVlg \NB ܞ5h6$/FԻ;!_lMu˱LOzWɅTڤ ҁD_44BF007=+O0-UgpRI.OCzzm7'+#@Sʪ@$Zfs{|.-_px[˷9As(kBRe0$M>"c&؈ޯտQE“?hmq51Yϻ>V qTҕV(;Y],g` Z=B6*E+$+  FqMOYK'ujOV_*"ʻ|{L]i(K{^y-n:Ւƾ<2e(Z(K! L~c.1<(ϕ͞<SpjO3B"(ܚT%CrH TNIUILLB{Ks~(9zk[M-J_<.QErmsyyA]\R xղZn4&$aI)NLiu^A"b! s{ YC,p-1@D%o:}HKvur&& kDu<:@eF,,"B2R y_ {=ä4 D%XR`"%`R£QGu+H5a:|b_kjڪI̱hZ0h5!xu֤җq6DdzB搓5!5] #s Nmqd@g)D)F,!˥&59[Ǟp7 +ÿH樐9Kܣ?.8_jV%6W".ʊiFhz8-/r*'e fJ}reyeys'dm\)# )&8㒉 vtjqG00t4oז:(Ҭ>p@RT*k̺ti7~E,2 E|y1l}Ի,FFo7SNNhԼ2 YJ<­.h%lXգ<<+J4OޭgZݛ^rO+?.->x}9G+0?̃|l Ej/ O_p54N1ғ'鼭ލT7d1 G3_L>>-zrg?]p^[!zm{V'IJK)kR?X~^Y?G7}cQ4jH':o?_ad?y?}W?oq޽wo~xEh ib8jG3z`gT+ԝ o{?^~|5O-Z>v5_XC^\~'8&Z1o?nF7]KZȚOPmu5?2y/8!vse.߶Hef2`"wւCGP)_V0G@q),bJ{8_!e #Q]-Xdmo . ,}J\S,RvouPȑD9mNU3H?Ӗ BD׽-)NJ;'9'#ZfgA?DT0 dԥ7RG1N;k>Zp-в=+&gI,w% ssБ-CG4Ŵij#%q&ъ.ܽ%A c3kdbnJc"mҽ=o\Homڪmz; Sr947`NW_ǩ)|[So0*{@}[-%b>hCYmRP f#XdmIjeh)ΤQfN21ǂauʙlCicNGW@0kᬟIghb'2i?|itR /'`0T!V7Q6o!ׁy,YĘǎ1ՏcJmgeh98B8 7|tFE%Cbmj@QNa 80ܰݶ){?$=KS: "Kgt_) @39;mƥPqѥѰG3#xsm|5}XZxqJɯWE0螯zwSm+gV,9Sbi~~n/[ڛٕ.s{~7pLφTPW?~כ;_{՞2~[ON]==}\._{r&$,6(hp*YNm)~opmV/2SW6wX2x萛kbbIŝqȄNnNymRm֦.4&!%N$6op-?y9%BcsedK}۹nEW>ߏ# :Z $ _8i$j{0U8i-!Av"q YP$E):8J*K=Ġ6CI+ewɨ6 /Nf>i;3aM3t9]|rLB @b\gϙ16^iZ!ْO0Ϻ-,W٣'̔Sq% U'399n*hHJXYwqg g$rtZ\{UnqY)ڃ[xE= y+=lUq3-ui :mn mG3וBLB(Td #b6H Mpm,P%( "ID 5/TDմ,E-s|}?ȹSxTY|;q.MsEjgdžOsc/sSU9^}6KHk+zؤth LHXǛ8kTzCV5"R\lMi)r@,NoA%o) Dm,QC]:$xY~Ew99d:$tMTYtW oɪ'mkh7}. V3EQ"7z_7|L;O7Z$#:#-{P`ڂx`G%=I&zcV,h|I=˃o5xb^2H@9xiN>)E]M{w޼}3A{A_i{&>z&frD~hݶ)ЎD`"뜊[ <$fL %q˳M,|n'{^kɹQbgu5hc_ې۔2/?\xnQ4qomK=yۊ q@}|&r9_K$7_zBbzfQ+׊bz^+׊fkbzEFbz^+׊bz^ -X^QW^+׊bz^+׊#3(U+^RKjZ1VLk 1ez7о3l1jўxPwS&QT˦?r;6_lzE<b4,)>KuNj EI`m"?D,rS?Ҷ+"n1QtSC75K7|3IwWB>/9k2(ϸ "+F-EVԪlMZ$_Пq!Bݕt= 2hYvn(F4Z_;bYY x"Xpɴ2K%\eYτe-|?, OaS=KM6ZJ8{LRL)D(k)t5HP(cK C:;tJ:&=ߗ`-1;k:xғ>Ql-ԥvM쟭Wzmכ_.y9mG s+E FEa%"2 ǨI0(4+L$Z٘(9Z,u $Q2Ay% 98hAsC BR tNt.F$i -šed)񚁐ҁ6X'J$F]v$BBAB$ڦdu?ִ Y0N ZY筵*+ZҖPAYl\.' u+H5#}qRqR6^ c.3:BBxbސDzlXJiZ\guť|osAZ+>"r&Ƕl,&}g6/Ta|B`ƽ_\WZS[w׺GIbϑ@1N1L쟲#C̈˰tR'i`ٵYr@++ %+JқآMH?βUi^&bv.fXy(-մ&~>?^1-ǿ7^|_y{ǫ÷~Mh&)(U]r.~nX{1 N~8a|=(tZv#78 ɴx Gm̵ k享yarԳeNWϫ&/ 'U!Hb3 .f\t\#]F$Jp_(y90m`l>q)!.9Ӡ,msQc[ڥi]:Ht7`Ne&BMp*H^hcRpYbԃNvvʋ56>y.MWd0 a~[Jyw8j4hl@8kmF~ܽK|? p  vKn`I[3H'3Wl=ly e˞DnQl}d=MWT̋-u#i^-ܵ(U?,Ψxt5FG+$KLJ *783I ڐ(c%`: snȴHHgHCA%'BDbɍPYugY@ǣ4o\{ե804f 31KՙekQ,F@! 
1FyfoI0Q릣]꫎B(dKAѹ,|B,=FK"0XAȝC q$k Y!u  JG f kR42WպG]~s鉆<:syp{;q#|fQ'gqs]_^>]0 ëτWUǫ _n/]Rքco2r(2 !Z݈Ȑ dv9\O4kFuO,:>RKg&U&@$~U3oEXz[8DTok̙IVd\mv<( A)TUUYQٔYTCTݐB"cAJ1Wn%^սe.SF=~:ېoxה1ɋVݎn񖎻>w5R\;9yoQ)':$k0[(K±ơg Y.'KI&/ L0e~^NLKɊ#g" zV),;c"q'O_WzִKxr:q|\d'epZmSђ5K<=<ȼWU m@Z1bcp,ups_F U պ e:MeDN)E`IRਲa<+]7R2bI3H sX,Pbm.3(iG:U.I-LPȽ2sbVZ_XX:u +%yMG[M@N-|svnL1LssIYoew{r¡C}Uڒ7pN6*0h h TFn_vk0#Jogmir^j`pޅj9@ZԬUdz)j+m}qZ|tm;Xwn&4W/οb3T;/-Zv癠ׇ>׷}>_ 0;"pD9S_ x"D&wC:oS'-O[Б1a,2s8@d-B*GstP!wzڔBJFgg:S*"ȖxTԻ\CeQ{Zw|S]tMawo? ͇pTMSo R7 l '|LuS:sӻm 2*ӣi&!ۻ[6)v|s tiQ7?gj0醞n߿<̞< p3!oͶEԹJncs{O{y ~HK7y{?#Go`k)_<?|Ըm"}O ZWo$S—x9:.N9Kl 2`%F]BKLvTVQ9z{%ho !yo>q\A=moܮb?\i8YY9sFf: <̩dQSJoj9tB98N]bI}zջK\.z>*f좡F%I9^d8dC9Α)3F9qtqt*U6WRY`ɎMD޺@+.@TI" .K[6)@ b'I2nr̤Dm pC=&ϚOrL HIZ:Z氲VnEM6`i?:$I$I^_$kk WS*5ZxO. #&2ɡ Q AVc`NZ\̒ 2= KC p b 6EÍ k2ږZwv[zX-&Be Wbop3radh`0z??}'r.rhT`YǓ!-d-N%v*)n YaEE,68D9rkYK/'\@΀p5Y"sE_ad/Jr@ttp+R"1i TP_;{8^%j 8Y9jRr]TbgIڕ;9xtTBLKP'/3QGB &\cSդT:ULj"_UrmNp3y?:TZY\pzdTo$L 3 x!9ց3szF{U[ۋz1|Lt*fHg, 4s.֡t˹ DPփqnB;_p.J'֩,~ЏAgeUY<=Sͻ9!Qk[-k})#Rb*$5&i7&\ yg1{m}td"*͍$!н""Y~0Y4z+}Pb MR 3cQ{>h4F++rgxxz;ӳtq!.9&fRCN6DΞ؏2X cɚ^y;SpPQƎ|f8Οrx=H!0V8gB6, %玣uQf:BEowJAYW ̮;('+F(ϛZ܌K)fI^ Z-bȍTɤbD(Z{e C\H@;N69qkE@H EPRuVQw٧i|4x3տ}{?ӧl~aSoxSrFbD[zs1^zTozK2n'%ߤ??qһ6~uR'#|#nӋvo>&oO`szR&4DLFGg=67c m{?~a7`9mzn~-}ŦniJ}۹]Xؘܿbk{N 73\/pG2? 5j Bŏߌ&x8C>?)=^t/'Ɠ^ŅD[m @NJF}^(O&`:[Y\/HJ(7iQ3_=PͿ,l'YWFe@vmJ6f{7|H~40?糵5]FiF iL=;2fF4vTA\㻿$hx:Z?M7?/r.Zlɦ*]  œj}zѢe+-vx8'7.J},⁵&Y._|=- B?Yjfg}W7ED'O嚳3>/]Rkw427$Մӌw-:~{yU"3]4]Nѿoo}!>[/~L/#5s-֒u|DvfKWY?yμ՛ܶF?;~(>v"fa t}3km#GE6,C6;  e,6Xd2Ou$mYm9eլfWbW}f#Z*p4W ݳ '2k@PpųnڱvϜMK~Q`ֱ,wp%*Z3H9 |<-:?l2o|S_41m{?9-܃xYW_0CQAqW _g.O|!$1hqbrvVTczoMW5[|!],M^tmÓJK0ԱQ myͪYO,(6͹_.R"7W 0nhѸrg;ι~]B·1ͻYycGݭw-OFw~ⶦ35YUy8Ϸ ب؞޺Sw}eQcߺߗu_~Q ޾\5#n7R[TS:['Y+] kQkPOZsU1tAw4|JIdz\\(p!J<1^i 7%0ny6Jg: ӫb>n(<j#jq ]\*½U/c#P*yzpL-`BW༈bY6(M7iQ`]71B3#Bb_`(K7=I{wө6@hVY2B)ΦVXyLYS))[TmfE*F)Ʌ1愰 +M \M-ݛ[lw-O0{E!6Mgq敯L.1u !C73^w+1[=vDeQ\h=@P>eY`eR @g+-wYpogԢ!5,ʔ!r &PAB9Q5ڙZr^ȾSJHD&, Bf A1ȐcJ5de;&Ζv՟df|&eE-O`KۘFfD I:Fy 2q"̏in>~+g/C~@ܰжKbG4E4"͔Џ d(eܚ5"ڀJeT59tN M*7 kB>"=Puz%EIAB1zm !\(bIULd9sP-+sVu]߬{MzFge|Lc1jtY>o>x7Uh Cgzh-Ӗ}4Ij> 8.ibN:FIo R`cL^8~ }>zoBw~l(R1hVDf`mGɸy |R9&QԚ )GYLu HMV(Q0䁻RE~5\8wen<>1!J-dӦ<~}j ; `ZR>zLR:2 R@ zb{UUM'^~I gw 6B}!$B@HaRݒQBURrE,M$:vP&KFke/.|,%Î۲vFIX 'v) P&傴WDL17dqZb6G_uzcوxá.B]&& kyhFsK<B2$,o,w\J r8ab0T2SN^<=H#?[ÒWY]K'Lt-ōiBx&ߌ^i|1c0ė~0Nڝ̷d>K4ozׯ/NWjh֜2 yx1VGѐtiK;O/i8%$o4ʾ]Hu?>waWœ?<V]("Uxy&;bc$%XHJj2]Vxy1ӫ1Ë[.N39`ȮQ5WS:-dia#e`M%(='yPvLf)fX9)-'5N6>/ѷ_|۟wwoi ER%abĶ^Z)~8c8AS\</ r\ yqy_6*p~iyȝ6!|w54Zbh!k^\ j^rǸmkOcB0n@R%81:_Řz@If+"qLRa4@RV)1Y.93J;FXza]~g1) {6^G 4uĕ򞦗9<TyWYLt !΋Oiڸe&:s'I hh#o4B;l>5HI8(t&R3d՗-J?RSՓ2#z RDe49 !HsT9D[*Nz4gn4?^E zGZ٬]ϿYyˆ٧.>Nಖ}أgBG;Sul}>Y[41ӨYDf]-{B}A&xxip URҢ}typNrM sn[p h'\Cə̃:zJǀ W8-XF8[@ę_'dcse67\'qҌozqSC'zQ:"Y>^jsL%ўdG՜85VhQ 5Xϛ5\zCV7"2dild{=!\G); : ơNiSI)%븉FeDkTA$ d{j9qyaXRۊeZ_n>EsNf1 r,M̙I<$+@Ȩf 4Jcv}8!LaxuL0AU4;Ak$z^#_=z0X^z3:u3- '/6ZY(d`hAb'T\اYzӾhjQ$G:/ d ^)ì\!B;L99aW5bJ1=c<-8$0fLyCLPȔ+H5qk =~c&rZ4(ĴʠҚ*8!f46hCɗDV zspIsC!5C!;.' 
H Ƞ2Wn~Px:CսW_끅#f2{dF[݃Wʔm~f\&HLXE{Usٳ׍rIxt*z/HrH-H:#*s& !xNOF1&N'WeV?Tka MY:9 su3|zS$;dJR(GEV)u AF%F]KDqT=VaNzB,lZW M/e[o{y:81-g+fQs(B̧ N5,JnEbV :'@'/^yjK}VW^MIZ_ ڎ:޼%̥}Oxuykz_zHѤ~\`NxMSBv04-rs];O6O|*NjTrh~KH4ƲA:jHzT~c^ԚyvRk-AjVڋT/$Ri8#]M.WkLcmpBM,"$ 9F}cq6Gl@Б1a9c0"̜HmBdU, ,uMn6%H,sqRΒ$ϕJF9^xQ;pt6Z`1&c+!n=@"u{qykț̖_m=:a¦ֳy:o`wF}Qfwtw=}Ҫm\ރ:YI7P]^?[\dԺKfz{َNWmEGð}5eC-Zv JGk-Cv?9Cq/}C];:]џ&άDٯHs9e}77tne^tTĹ1Z9Wk ˎzd$&[U(8rI~˄IDBL#Y^0o7c*%WL cR sc9]pv3j:$]1䮪F'宺Um6g6|ʅQw;@-m!'HvA%xr@cT> JCPh/Uq6ǘC`Nm.fAOR}FDZ`S4H\+Rpv#c=YVCXiH74@>ΰ x &ףWA}ftC('C\(ZI8Q&Yd`&>0¸%gbvsZ,BK-`2N~NӼ=z 5xŷ?O)gNeg{3& 4p8Mǖ=J `axqAu'k{--ۭ5yqul=䤻K˶T.~ywjzf~9|B/O# 21рޭuw$vW✏j%qtm\Xj+/2e}f24 i<ܔL# cdN_1_vֻh17 <$[pLPc 7iqgQZN;,Mq:(QiߜQt{:@|TqiiD7Ad$M1+ZKm2>=zIiύĻOÉs}lZ*_W0;x)I™Ffh[}Wܱ<*}ْn ?m' zUkP?0C`RgS1ktA2f8JET+*B#@T0N= ueyMB:l ?Fð=E8ö$R=\f~`r5>쫸YcDTט A{cbIhwGG!HFx{#“ B厌{`!r0H^ xȤ$iAi1JD#"D ^d כb8&G̻w'$g{nj;C vɢ ȭ҃8H6I`ifƢe=h ,iڝVVxd:?=ly8]KiY-m{+}r>)F NF'kZtNe{{u_O܌p-]4. zX ")<ĿMF]Dy3Lj^i$7Ι- f-Q㊿e>;mfw)R G<_̿}b5~HYێ 'X|q U R[ joBYk,q\_.}nƺhCA X@k]x WGCmg9; §Í@M^5n+iUR*Ol@ W!="wWQv==a PѦGqb=.>vZVt2ƌXs[I:лMl{mB,XT<LlpZ0\ :pc-?c;СIIYxh<1t Ai `RIJ(M zЫz1648dAhdAQg5zYY.,&k#3x,C0遣 p^58h%G{%hs9dIoe6يLR 'Q4,ǫY=ճ?{Hn˷ѧpKɴ LM@01j)w\fw:W!K+4ӳ]=*[9=v,8H@q)P\\3KPď*\~>"xu/ޛFóY% |<3A-E|"Ló;;s}+ y.\n!gu+wߝ^].^b4jqR 'qRuV @&@{9m!f[uL҃Ij;?v7p{>fU3Wa8;rwc`(6mw{7ol 65#7734_d9W[F𺓏g=kvi6*VgMϊʚ!<ڰMUWQSq"νuVRCs⟓.ۃxq}N??~oķo?~OinsdS.cXw7sn7J:_/Ʊ}Hv*4n=%}nUif_2O .ƃ8b|͛7 槝N؋Z4M͚V;4-U˧N]Ŗݿkk7^n AO8?;Hhq|7(Z1,\\d )R1[7m,+J8V#_ ~9(PPI@%PkڐBl:DK4\s_yd{^H,?irsPQ+}G [QH *hmK"SJfW]*J([)N"x %%$te.(*d DIQRX$R|'5>dYmc0#DhwxMtU9oCݢoͱGG/`  6!)?H-glYW8SJFJ|#(l=[L:&[O`n{cS}6Xʮ̝W_/./BN|:-;fMrg9 MܭU3ȦW?y7> 7{j'v;uQ?1i-hnܚϯS=6%u !-φH`ǣ +RB| U#WLJi9uG=0Q(<,Hk,WufyȕVhBҙ|k=Sr:fh2!;4'>! AL?6!sn[duɘ B3QQ0-E+T:D]GBԅ+I}&",Q*4 EdR%#cAHsO\p)cx6Σ5qrygv 1* |: IW"I!mP<5jQNj[NIy/M͉$zq}DuJ )]N ]\VĘKRrZj&ZA2^@kh"~;~-~.fKk"䔆,򐼕hÇ`cE{k!dzcvc*tH'ol|43o++U4>?07K /2~@(7{i*Og}-20t`*yVX(ޤBODa3;ȋ?w G+ S>(HZiej$VaX] ha>$ȟn{by0{f_7B13+O}4&&ii&& bu_*$`"hI ې /Y6IJtB;ѻĒVJ$Tե7g9n&*[-zEYf)hYvK1&|GজtS"eq cd*_W!{^PC?KD)iWrJ@ :a9L6+2#g!,+v]t!3uZެ~X<_|Mk(AJWs?*."#- AꨌH[<TLL l.vyk֝x|sLY.N7B =Ï&!b|Ԣ۫[TeSQw[ztWt#`, ,NFc}1LpG Ge^ZYrHDt'ҟB5I|P#Ά*&;`%jIjn |{Q`jAET@t AV 99dysc!+b/S Qn, ͺX0^XIX7IЦAByֶfEb(S 8iYIe཰uX:rLRaA{}̳>XOGdoVCTV%. 
U$CM2P b%}&e$xk|ЉsiQRb086!JCJPϒ%ٶ s6HPEZKBp ]JJ'glͺQ{6Ӟ~zx{C RwVux6x.ߡBF*l S+V;1B6@]xG׷]a!Nɭ]8Ϯ}B۫ u~b9YO*Hf۫t^arU8#nݫ6j+K5/MoTds}Oy9?24wϜs᩷ͧgEw=۞^=O׽>8a&CG hvܱRN^eo-R c~dXP3#Hө܏RRR݀PED~1Y|% -",Bpe0b&e&ތ,-S2~͂ \vWj߸m,}  ZWgTN$F@k̋e0Vt zm"SJXnO)RBA/^ ΑbN!^yF ϐ6UM")ٻ6r%WyYD sYL0`9AH:(V'A[-Yv,KZi 2"Y `V,^9hm 8Im@Fˎ]$S++ܠ +,'5EQy@gHAƚ]6*%EJ$m]Wz,gR@s%㒔ɇg aIw*(0VrE筑]+zB-BK?"Sj9- +`s 5MV/o~fZ('K+9ô9Vrnk^>/cR sc9lYP[#gIa ir3)'{]q!N#^'$|P{5vHZ"dX09P'4F$"ZMb1@YrAГж8 #] l k@mdl؞b!Aa쮳yь7/nrz9`y z0>#Glo9JQ94 X"x2$2%$-( v[: C1 &CRؔ}h6mǜς'MƔZ2bFva\31O[SAmޡvi,252d!ZDb!p(BN: i+T^Svm NsE2i\2YšH{IhHE ĬI":Yhk5ram/w`<ٶc_D-#CĻ8;`$ !oI}ZҋI/ xΒ1n4,RdTl,d!\p&ĀNn%HPV%ۮ9;d|htY:[}qQx+;vdng : zeP'/3)TNpUR`SָTP퇇G==cGU.'+y?ZmPFõF0І#`K™25gޖtszF{T[K) ' MHz wt>)Tڥ !e1$sTC]]8F&@Y-+szQ:!NӶ1臻ULdze|=jmM>G(ǥ$5&iCޘyg1{m}tFx{!Qy[ȽCŒ"y%.ףXmh8WkfxIp̋ mXY[ӄI,k ]U(_a8EW_1E\7ki,s^]%VW-4|XVڡ$LETѠkxZ|k7_`E=M0҇= m{p_@@)lxܟψ&jIa0]t!!̧jl4j/{?Hu/{xK6Qv&{v^ϛ;m;\wrV͛-qKƷ!gnMm4@W'Bj~ mPOwf fwޜ؉K)ivLBcm{AQc}tj6ny6ƓEM#D~-6 Pj;{oX=݄ybY 6qPbM-#;663͚}7k[d={Y4N̶TʂuJX Ǭ8 ȵZ g!m/A0rvΧ$oSvj餬oxN;/>b<1t Ai `RI6Joz„Z{'Uo53Zv=lY/sNXwUr>CzjTMaj^UHu>ϪWi}aXP>ixUrqE8OާuPz:eGRs:(Qi;AsZYjRNZ)7f$Sr/"my]L=SMjz9"i5}[b<_h^Ϯv1Q-Xvk^-./UZ,kg͝\.lfg}yZL5f6 :T?zٜ'yF 3QjɫϪ_(.i΂& Iլ~pO/`{EQ}ǒJvdݥ!}ǛXS*){S|/{JAl7NWCEꃦ2MqMzjC !M$=o!- dz@ߜdY嵵In\"E"S{Id_X7yxиm^wt?'Ltt_t| OaU?~a#SèkIlրa։bq ,9\G/<9[yB\/tlnHCz{_Zדln2bWtqB)J~}1F{'1vz4 nQ6Ow,B6-eSnu٘OۚLATJ Vx~gcW/i:7Ao:M75ҵB1EٯtjcrovRxhyś];hw]uw_x./v%jћ .3up{Ȗ'4FE䂀ʻǰldPcs;|y+l%eߛu,2OO AJ]N8AHƩ ^BAFg$U-@Iev&YBnP;IiʹŧO9ТMzaL*0"G<&z\VZ0RY9́i)I+FM^w1X/]"Ci9 " -(|JʖEh4\ Y܅8}ԪQ\8eڤҷRKCRݗqCEK9k1il֌sfUY'"kHҩX+HeI&$;\Kq/ ,_4YÉIE15}*d) a Gsbf啸/"Se-1A0Cn6tҖ&M"w0iaI.]B5L4$UZBٌ>[PPFF/Ks{K&z1g^ `>tYjI=h%<JڸG$NXB2IE[BVFD@Hj Xgv8zb?4AÀ/c4bʦX)[XdX}0(גu3.Dƹ] sHr=Jڈ)K &<4}d S]RwLMZگt*( E?aB E1k#U>).d5#xKdֶڔ^*QT3j,I:'Eէ (хk§tp}E DnJ3HIw+]0+(IPkCuYv ӌT"yL*1c5.d)5>%j1 uIBvp ԁ` (-f1c &˽e;VDG3PEȚ\\Ƣ(ΤI`&#EP]  V;#2Ϊ ת ¤0,,H1dlD(! cK:QbզX5{-6/tXI+,Iƒ5$RYIe6HVӥGoU4"ZF<)mU(am WZR0 z0dkJqc=A;ns:e16}ʆcl~޶93ItUSiw@7ۭId3 .-zLa-h De Ú 5)DzI Id05I;)oFj|6lDŒؓUZwCP^: 5\Q py݈V}~nA4(QJj *#M)XfR 4Č*m P!1ւM[ [ƮXdJ= QFV ('Mm2X4M57J ;XIy/iLO("ZT1"&7cbw ;^CTXd أt%h#*`hAgp `NmS; "7 3 ǚ&EHIgheݼLFL@J-Bv Z:-Q] W}%zfH֫YcT98F +(|4jNae+MK̀zreR\H,f`J$nÌ G9>X2'Im!X7ϦrUrid" J.YxY 4IdS((`jI*\ m\z!wrt 0@5[h#45ͼًŽ])54xkΆUG~~)EbD |_=(#.:A9iy_osw ZoL` z.Nr˫w);{Az+m6[Ɋ}jhlpx);c\yi:̍# `q,3 +\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł' }<&UQ\\s<+XM ѲO( d Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,z*(2pD+28\\/ıڗYi4 +c,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W*RpL++#&JË\Y`՟QpE[O Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,bդ *紣zbhn7pߥXN 9msZ:|Z5jvFiٜ,yһ.3'4@jΖYOy}W7qRuޞAfeqykxۮ_+rnƩf|1?kA$m^V`1Lon,/>Kei'= Wgi33V#lpkCM|o;;`~+kBnJ->oD=1=G~E1>3@>N|O$XJNrʒUSѢ[3sMOd $Γ<$? 
ƆcfRɔcN9栒^ AR9CJJ /,M9c*  2;rc^16,jx'0x>A?5P,W:ݍ',8q5gzr{k*gnR0fp>)ʭq}{J%sư fml#|۟7Ï~7rn^",k\jzMۯէW˵7ڔ)ІsZ29Ls$7En@JZ|\$-B~r0\@`'ȞyRKY1 s51g9[2JVPs.P22Bٻ綍$e7jjvW'..ռ 1C!A~= )A| P(bӿE @9MgQ/P-hٜ܌oGiLo@InG_UۺE6'WEHVa]: G\dDn{l gLr ΝĞ䭖XZ.?9]T .AkVP}+8& ! ` $ JIe!R9A2$6ӆH͈2 gTcUD4DDMRr,1 bb}GA)*}A,ȌxJN fʐQCA,iM61Nq:})cU$4D ies-|Mzd4-46XLYSRQ A+&Jb)F\+`12F2%%,U+MY`Ƽu.*v~j,Ĩ}VdYEDX5x2+ Ȥ|:utYhXo\eTd! 0C'BՖId1B3jX<5`rZgjجȜ,m}q6 lӳE E""E4-'2.9J,iP7?!.gf/NU/l4G7VBX= _ǹZN%M>KqTbVQِǏ>WZz ګʅO R9V# FYQ*<#=%[<&9MnnWNٚɇ +ɰ[@D`#n..Lo-CGɿ|gkLI;"`6ߍy;tXgF D/.7Na%zv֘ [lIQ'/SfA_Lo6ITyrgg>df6S? ˮkWl%4[dNeݛW4N3Unx0tp@kQ%??$V >]D^Fy%,]n5B],h1XO9 aZpᠤpfThc`0aӦ(4~(eQ\aYjB4FmڬM6qc1шl"J#t.lw tl JJzR 7 [o{TXFEk^5g!1և!,^9r1g1ĂIyntڿ ub t-="{KS1?|P^%sD7AN=|.@9ۀ=d~ 2Zi `ֿ֩uA<3\ƴ`2Ed3n=k"rwt+o5hL(\n+*7n~R6SUO1 T SEH!⇐ROcg *X~cpt`֤T ;}EC$sX~*Jᬧ5V)i~ѫ".ㅕ8Xdb Wg28h@ZR)t5+xIrNں8dOQ.Awo,tF Lbq9=&/*o:Q^]X%n|pۋ=Clx]1?ys aO 1(=^?+߭X1eX*XVsa Jb *CŻm7W)B"=rka%Y6`&51vT d{; qXij~]39C JVo0u5EJ; QrmA,{(,`1% /ҽ{=m( C>irJ(/7#?zI$q|m8|$n#Ismbԃͧa )$/?<`|9/%WY>#0Tŏ,0[Qc,pr5%+Kɾ_yU,%# A;-FV$bFHC12/23SQ'ŊLx`X3,XΩGwf_JLUy]TUpszc]xM \ ~u)t[ u=C'3nS3Bj06p*6KB?{J<SZ:ꋮnQR 1:1@Rbny*D[XUV ~qy<ؼDF>e-G4iKx8 z:6\}ا\dd!"E{:͊98/.f>EqP'2gđC9|2SjNjAq1J]Xsڛ\&Z."dzzV o&׿:´5X TycTh5{%[}Q.8\$͹5g KcRFJ- i]Zxm{ *oN[Qk*M-&]P u0nZ1[I6UkBOX'/lX]*A_7XuU%֖mrlI85 8 Kra%EE*fdzw\\|d`g|Sk(L M wb4-Ҧˏ Q9}.,L4 EbגƠ MRI U8B[ɔFsJl4i,`Tn}4kuv/:4 Drc3˯+t<1s/a)|0%wlL~tyog smzC$1{ILfPZN=0$+9 emDk4nkK ʖJ8.] u>*_bz%:*5bC`Jnއ\6]ܭ^Fw(htl eG@Mª% m qJc˾8g?%))r[~*!V"H))U"H/I7hegʚ8_Ae Cz('u= jDQ PuX]U_fUY(2NMV'4EV!A4(s-kK< T9٦1P45iޫc`WX/8Pf |8N^%7LƄPDKN3tM**d-xll|\%*Wu""oJSL&: D2$XbEvIO9pj-H~8#s$;S,slB਻Ύ{,9)r;+^Hi Jb΄轣jbޛl|g`sL1Fb#[ Lk<2pa' ZWik1&zUMRpb([tLyiF̧z+q bl {漷v;9Ռэ#j`ʛ;# ~\??7.Lk}kR }>1sjQ;VJ<ݠza!w`@=^' D~<2<7~0 inôek=ahAmjm|xa{aKy|M`O,\n4a}Уac1[WsOPK3Q+;)ީ}_ *RT(% l"= ü@DCI*)>tG jtNw+NMB;ws /FTbl*5Bo]L[!.ʭ: Hyd?v= @ k16d-flJr)ɩ͖QSogaohֵί{ cb"kvZpqPR8d#AQ  G}Ĕ&jWSҔ᧏_<m(K\8esNWiL[BюF`om4@2#oy<|Cat h4k lk'-` AE>:)1止0N [0kP +`TG: a8y,2R?n)dٜJ[g XqS_x5ez.m_1zrŏ,]̳- X}^ôldR^=1ٕOj$UCzɘkik׳B@wTsO|ɲN'!oEKxJ~dCi)HR rXt&ցޤ[Bs[Y8ysSZ#JR rXt&^f&݊+7΢k<3yJ4"͋<>H]0{ƃw .c3yq\З) \x#0V.xdrlj$a`;gA]%#, Y*7.4(Q ũR;njLb;OeKR.F:,p6g!GTH|?6"4'4MI()duZɣDRQa,A&άڃSK'e f.+ K57>񍫺O7n:}AqgVPluMh0g}Q{@x|?GH&:RK$z*i@TK=gQS0FS`f)"GjJ F,D~] tk%aMOW T]k9H\lfdVpuf|O?LU"ё${"IG0g/KE$G!K cd\@Rz|Y+?|Sr2KT>g٦)X46Q6>%!Na7<\GXmJn_<͆1.^LK[,u$ `}U`Ҡ,uY搲cXfȌCn F B8k`rZ'4vWH$Z&gnVczr!;8~D!zkyLg-3%iW*?*oLި-S,E{ ELMαˤdHS:IOT&bD%:Pr &HSr`+8RISaP⭻rF(rGC(5T 8 "=2#lAb,3L:xJ- d.'!%C L. 
X$SY&  A;sCDGKwj2ZcG }HoֻZ`LDI 'B m§ Zrk= F9IaܵLIp7a?'iU|{n̫"*GR>K4.JW6>MqAZtWwmq_`>+tތ/sH8s˼7J5#A+j` ^)%X#sRTYX!';1(8Q=̔0T#%!șV먕 i,5JIRaJܞCKh~;M֠I}*9Mf]:U1k^w̲d /QYnҨZ-} YRֺ48sx> _+-l/HɈJB?/=B[aҜsz!`8`T,bJ¶R0EzA3@Ř[řuLZŞM*GADp@݂.@ 6mB@T#|:qfk0[1̖Y.땡;{ϰz5;]ͯ*Yw鮽\'?IP:nx `:7`ԝyXy~)L͛I*gٙ;{'iӀ`s\P; X}wzUᔦTv6Z>cē^mQ N&sḁ̊KU۩fH+ +T?ZYFkmrL׫whq|̰#,䢨Y=Ȑ;;~-˫E󡔌qK g|\_`բ #t<1P@sǼ7]cp[MQ_z=M} ?:/Ue 1uQ;-q= $ioǢDmFl8|]p)ˋżrI7JA JYKNgCgFlTTHFl9@3Liw_pq.jc͛ %,ŭRFќ7ًY[Ōe%FDtP0pkv$ d*Y:OEn|HkM9Ua9ž.:fG3cI]f}LmAB2޳C;^El״SH_,O&芔gՈB\L%y:niF)9NII=DDHu{lV ;IAE`ke$ʾE/ -m'< 8{mG,~v)K%I헒*G džH2>dؓ!1#UԒ-WS,97P,u%>S 3o}nx{͗Ec%2U9TwfĚH~:a C-JhOϳ芄 G3¤BǏzש'<}\sq:!F*34Ek0WZSDO$h9+؝v p0 0|/ }x.}>&5}vMoH8]ê_Zyy t:(>Ut{l4{C>./1#;% nv*[s7`~H8Lŧi*$T11Zr 2}`Z0bWWUsrUGorB:SY#-v``s=,2sNVE$VmW}p+$J0wia@K=q#lprT}R3̵$HɎo+54$E Ù!%)lj!ney ZZ/c݀djƊ!v$)%_zRg{*q^U&I-=Yڼ_ZS@[tZxoO|o^襓[LKw 9OGbtRN}*OkЧH5?~9[ ƁJE82$x2C.}dL p)xʲWR~wI\xx&q޸E9/H0m*m@b W Ԣl]9i='!Լ7O5nhϽՖhY_@=s ~F w/%"sٔ彟M~ WdJ]= 2X@Ȳ1B[ޱc=q5Vۂȣxǁى7d7{L;D hH@hJW)Y igQ;40&Ⱥ<@Ƹ^-$uWDMFI]'17"B0TSG2dr`?URu?Վ2Ocɴ✩u\it+I.K;A„ʀWQҹ;˱&(/I7wEsw1J ~zG+?kbonk>5FHu1gh̙pn#94R%NL kU^K~QI=WmzVger 3VSqס{aOo>f(uklTY7 }Lh'PYQR$]O7}s9o~0x3›ŻdNތ%m}ӰwV7#`7 =:1 &h+DrZ97%\08gDʅrjIk2_lBT@%YJ4[ 8GiLWڡ6td& Kx Ӈr^;O|d8Qfn Xsf{1*&Vd)Vcі*D+cfD).\y9oroFG3hc΁i_xi2BuLb2~[?Q/,\b$^/-NN#{>!f-^;*^'Y]Ac;@]Drʈ"[$kKiZiJ{glygdt֌\lyǒ뤏ͻ3R0mwF[ArRdIQ ɤ\ޡX傁VWFQC>&QL56c’~t"jb|`o3`)צ eN"4ھAϻ"_W&v~Ff:ýg`& i|^P"2zZxUtR|>}s.*Qў#(wJ2Pq1H5Lj8ðͳ/e'.Mݸp|D&BG3Z Fw!aGL{d hղJɛ-'/bxnCwȶ bocnP9FF= 2 ~C jZMb{wbΰNST1vut[[׬2D(N:# T4[뱞W#S$s\9,rRU?IbؖwV* Y>% ^\\wY4Yw X6:k{yyWKޅXńLs-!}.]tSW`w Th$/ &ڞ-%}Fޢ1O򎗢ae.?*L{6r}1+Wz+wg`=2K15<(T ?b`y۴rML91Ou#ˑ덵)Ns>]V!E]|6ʳ.캇&aVu ʪPMH7P/UɿQ @({bT=L7,.GXZ-L&IxSFmD@Wȕ^vlQ K(o6R]4;m\T=mlketʳW\?sCL'tZ*Npjec,vnYUXN`+- QbH;瘭A<#զ&&9{ ,VX?xϙPh!9 aϻ1lh ?m!<%Gso` 0GpJ!ǐ`)u N,s6L3*=qAMKb #1h (Q w N-``Џi^5ٻ3Jա_ UR1?`|R@8V*' g9yjvLlo4c d#cezhYOsX%z<y? aQ3n?-h9(ނj#dN֎Vwzj֥͖* 1FǨTpFzD)mF XU ]W[)帿9H@3ȡD,+TUjQJm}RشdOۅrwyIψO;nvd1;xzw}M'-1iLWJKԤ_OF_S c.aS ˺L!_AGcš#5OF2jݜ&gfG_d⠩T@' u2K%D0"X(#d܆nndbLE^ (*'BDʪ ;T Mdbf n팚ȧej}Y!oՆ߆XpUvҏsEc5eP]2 Ê+di1* $Rbc#-592d{)bK"7%Q[J%#{ X+!_`PമkG.$)C"OS=2!WZr eQ|0[ =Oq3xcߏE,쑑7uu;?{WǍJ_&vc#q)BO13;{=b^C6%y'7Q$}PFUuv 2Ld~0?Ĝ.q͔{ή|*Ҫj7|>w%߽-R_z1>Zז­dԃN&3˜cRql&9w[R6*Bw+0ԌgT^`ju6JP*16'Vctj4WH@fB1t:GӫO XEr7s} f[7-_38ΩON?_c"myVvGuIM="`в^h-MZ(ɡ|CD j[:N (Mr^8\>S/#yJu @` a^G%HdH8)ļJHZ֫/mF0DH$` ^GR0Q$X:05Av"2E0$KLJ2),ZRˣƙLext6("Xi!:P&B=  f˲>^ E}-^ Uj!o% D|0QG!$Vk')w)K0椇%Q`UVWs\r $np:ͅ$ AsJǟ ә+G>%x2"h Or\ ].4ͤݚm?nRv(V_-^|n 'uʛ Eʼn1l^{ϚњǩTbxiAw,q=N[!FR;t/uv3 GlRrAF3y^[6 !و mҔn9HnO1)Tu?-7"WJs*c陜P:_տBS\1z>P !AY,Eh6))KڣD14cRs6TRCx .}*2bR. Q!"!ZN`<1,N ćsTC37 p_bIYg4xsDR#$&mN xG'OLDglFgl g6茑z߭ I@8Ȣoݛd; WpQg=XD+ !,/6pÕ#x#4PB-BTb=cn )J`(+14B>v`"WEWa+E=C">z-2:eC&n@(=$ gM Vb&3 0Ttwp睎7uѺLgG\I 1 Iӏ2ͥT9&X0Ac8dx[A4\U&diW;bbH('n-$`&K4o*>Tb8͹}oݔe@Ƕ,' 5 ʳL@g:c 9?!Ĕg9䏐i:JrL׸mx"Ӝ-p7=lՉ+lcFו?ʝbĀF' nV2iq-]"u"\MAOr~*3I4/7Dq3#0B&q|0#X͈j3ӌ͙wKGm(^(5Z=/mLq\iVzwC%* (.Rۋў?K;t9ȞYF}eZ̠-,g gmo]Fj&0@U03* vESz?w~, b^HIc:ZApPqn:߰tOy:m6VrN>3%MWzoCOx΋qub?4|vFp!q,^6{cX|-tpt~*#R4YDe& lOy឵gKXcԝr|0G:DhY>oh^w0KvDMG'w/6dnzΟ_^ eǢ/l狋w_=pvbqxY!|oZZ}ѥyOI%WRMN.Y *=ղz^{|ѻ=xa/GT t5{P'FaW6}{o1Q+|te0֒>o|</>nR^}ivP=`xqMHs&x~)ޏo ˇnbLh烏.>#voxb5X9UԺxQ+ -VC5!F[:q'դ]N0Iח$?7I}Ѽ\_\u{;Á^?۳<;cR/GoVOn䃲w,F'&?<\ƛ#n懳-d*~d'!E8-a!sޜB{q!߸պjY7ܢ!XZ JT]|i,ͺӲZ>4W*jUǹn?߭թ}G(ʇnWZ>4W*lΤcn?Me;Ю|'kUۓs{a?,{,kQ;'J"+ ܘ1o.x5xr-~)w? 
{ss9$[W[,wGr5BB)SX nؗHVK~D"">Z8R6HA(ͤg6e8rn[J4xTh΢_zR\_'#m[':C*~ZЗ.W{L7 xQAr}y+um҄_} QoҒp(}0ew~YFY/9M+CDoH_Y^-%L86 WSgl"Et;aƲU[U,$haH޼h75b軟_ele?,AVRyCʐ%K4NtB_}ONI!FُKDf7gqnq,nS㝎d=:zu)N#2ZGMdI\$%ЍH CjN2k_f _V?,s]q5?u_BJ74,M%20;6jry}9/'c,tegx2EլwULI-V㢼]DxWDkuyrS"o%8;iKM<^Ǜ&  b`}1<@6kr*W5k}7"ٮ5㹫ʙ8_}1/GԌVF/sf9ڳ_MI9_97;5LSw~{[yHY*M`eHl8@!Yߩx^% T1%AHlvkg~Ԙ‡˴eR?~qa%қhT :IX .eB-pԢ'%~?p{/?>f37A  wuXHZepޠ!̌0sQdQOH%e[S%%OX"iސ\mލ;Nk{cKfIIf$jP8 VU40fDE9󝮖I"I;"9 {ks Wgz6!b7y໊H.'A,k^-M-"LكPOaHZSMEҺ~D9gg(59n>1۳y\_\͠ HoĽuN@] 7sf@ .)K)bj(.Q # w~w`DJFn@W'l3EEr^V~~q4sG礝jxʆma67?} 95u# SlA~t^]%3YaeT 3Ɓގm pvAD&6S%N,b˂))~*AdQcG7BiH+.KJ+(|Y&=VNZ8]XS8ʢ k_v_Ɏ:nu_fy˺e+pdwTa8&Sq-砂J2 ֭0.ioZuwɓiEo*5f6zKQtmfQz6y9 9.JQ tf6y.dzy<^qnl&6^:9P=;plNt~>@3X*hSHd˵zvm2åLZ~oy1/ڇTd2dC 㢪dEQuv;Ɠ!ie m4)R6u;$ JN;D*O wS't$`P`TЬnTcq ~uТpE#N~]1WVFKѲdL`3vێLg4:̠`|L? %#O9sz$8erO7o'U =(8A\Z,Znsfˑ ˨wQ I1bzoKVis=LkפH> aEZ Wć1JS?|M7"@b]g A^'lD& =&a\[cMj'!\+ > LR@/|oDKjl@GxԧTOXׯݏ^P:*֗vs6^J &wxte6mRT.H<:SkO1w7eVPJ0nd0!B.XضcG0{vD ߹mJx8DLQRSeTm`R6|&* xWwpxm_l7K=frٵ"[gF { y>E곾@wv״I"Y@uO!"TǏKYO+>iYbt=8"Q=ЍGRaLipfي7< TR&w@w@ͩ6~X`\x1H1)ctT*Wwx:fLno.A$Ibtp?Lsd5 i\!wJ8ט׏]¹,|*'QFࠕop&H%;خXq+el׷N͗5q Pj%SmIEkP}1(u#p_hÑu$IΣ\(ljllvy(vd ̘N RZ(x^k~ lws?FBZk0p1IQ,E .e{ pA'l6'*(m` ˬ(VHE+Cr?)a;N~ mFwEe aJU|wr&rFTSIM]UJV q6 z]^%Q(-!<{gCO ~L߼5/)yNڒ" VTbFiR^it޳O{Z_'v<8G $ρkRB3x&{積\E =.[G!,o&(S/|td\LOd1]#!n(qhQkk! :Vg~u 1Ŝb@ʨL~ɃЃ8WCJi!Q^1RF o$xy{cG ~ۧo޷]?*M8II"sdL>EZkM d0*m%T br'\*:i9I+næZQ"CȧAT)_m "T0ۚjޏ|j.wx 1 d9:>74_d''k>8aQq}g"u&?4JŕFkIKDщ,ir9'O6A7K`gNfg0?EucDȝވ4hxSY5/Abк "-{E=]o<#N|^.ɕ >]uNˏ=:}hc~eҰrhu猥J.YiXJ6I-oHٍhfAY2hvEO92V^"fӨ3JIo(E$ wbTk$hB^}YZh08v'olJ4}ĂϦvM#BQE\6^-gT>nzj>_PݼzXh;U^p*/]pQU  #*ȨRDtJ Z{UiE>TO _k4pY: ͭsV}1؁>e]΀C}9|<6E"g뫪1l}u6+GjwܜĪ$.·e{k؝_{;)$I箪bҒ,frt %bI&)YVXYHTBiI;&E)N:IAG2|zMBTɌ)!D?N7B'7s+g)ڬƈ=+ZK3ɊޤlPMF%l6ߨ٠@$Xoj5ԎttP45ک+~Q{˽_/ޛN@dХ(Eim]> @n}## )ih"Nr|0C P…_] 2WƠ"^2ʸMi\ { T WiD靑,;Ш1v )ky v_lJ!C\~/aV52r`2<.҇Po`HatwDz?ewm&ajY 1k<6؆@"UuNթ:ߋP&@~7cĢ`#^EF c4#Zj-T Y_ŀ dʭ&ޕE!ZhX(cB',J =| 9%,-po8e5kcQ xJB:Rtb4%WLy(bHU!Fp!*Ւ#gN>m9 Ve%E#{Ay1|F&(d*"\ ة( ^:5q?VF0!4STv#XhQE1 [kkK+Yoẽ ;r 5iP(Sͥ cBZ V܂3\2ňt7v-]w5%;]gJ/|oեJe"Wu%J ~rxF@p@y$Pk2[Q`6`]RR f| 8+,[%?>#lޏ/"ߪo% %Q]1$i܇ȳR teQHDB&9.=F"IDa/S Sg,08,%w%|s@EIe3A5DtUڡ E M#Є_%qJ @v 6f@ ^gfrQŷئIC3pqJY-Va`ԥFײ 38`AeZl8siHz Zى9cc;Űo_Ŭ.10}ΫSs%&Fs3|gWF'p ]2^ύT`bZ Z0Y.v^, olZ;{Y &CBJze''`ط!3a4zRHac*#g?Bw~v1R6鋋;- /R{;jw=;~owZM>c?mﲓWog]s>|v^;z?D[ z}x|S_wdZOaF`~{/༵uqg]5?b\^}9 NНd ]ݾS ;S-Q^,iżib/.O '{ s/#i@ ƿ=}ia'OZ>3ily~JO OƦ?:u?˒gJ} _1:HêCEenOϾ<>>9[?L?TŤW>Zh+_w^GΑ;Joy$ɫ\/8/ÃZ:뎻zQr1X򛝃0y '{B0\}EwsKqa5*$9v]nCx۝R) Og߲^d:8zˇQH!8>lxav{|0^?_]y{i?߼=v?#!Xj~;ѯ0"g,}x01{^e3ɨqbxw<ۡ|W4 4ݩd!غN{홛YO?[|K&еnDS2T8dL$45{' 蝺ݳܼ>2}Pwnp;0.#J2Ka'( oN߭mϓ[mW1J<Fw>/6 w)J<`}#po> ggkgw0K/wΕr)w9nz=\;[nZ[m&XI1*=.շLI=@afyDPKKʧ-X}S06۞fD[ID[Й-XwmՌధC%MՔ1Uju1DG1Wj*B3,/ fzFn^¨dp)y8`$^*EAQK<@TN H0I`8'ip0܀ . v=^:qVY! 
ȣ9KixPx|hk}O^ÁJLd+}%%J7-β{Z {_iQy-+l^$)7@R%/!\FVuwHς| {"T[FkU1/ct?ZƘ\֟r Ř'SQtֻwi*y.+L?}ַn>\,Jpk+ |&2bz4qE02ˉ,O;e36lFI1XSD 6M =2hA|ǹ0hpBa"2 LUP+V,O_D!p0,=;2!f+ +D!ڇd`) <&xpOlFl#I|nh $Cm[r +V}Swr;4W (1'𚮘|(Ib^(1 3B;TH^=줈Ԃ@ pAH#r$Rژ;_Tei@5ѭ*WL%iw;l1# ^>e9 'DJ)7jS@" ѻ1XC((I!T˵KM#4o8D_+JAJ޼5#GŌ!GiYsgܦ1{8S{M;Yl &#r,Ƀ 7=)r5Bn)6 )7F&$K6fq&y&J`Q۔Ծ$=QRꍊRrB8o8͗(1zQTMdd!Ei6Q5R%)P#(e6cYQdDIpuL+.|h#QI*爎(z R T4b}B)j.fR&^r-j%'&&JesWpJ՜c+8Ciͮx@Z%i+]NWwF~Fڿod݊t4T Áp[/̙ I=>\˔0v?rRy'~Vإ߳ͰdQN48k# d.2,Ti һ5Ax]AxI6 H3I<#z3kxFgixFgd=7Ҍsf>|B$|G:0Uέ'VJ4;KV(WӜ ,-oF~5Z'~Nk| pX# 9)9;[ /[[J+.AϳI9[6L"0$i§Y[V@R<B1~~&Z<1 JM{?T\g '`⇌O'vR23|bwꛩ9*Z+{C+WHAcՆWm~+ǩeepapׂ$9+YTg#1LpTDDyJZTi11K7A7 >"%+P̊>Z@RMT/Rib VD-4 mzqaȸl9W؊B@A+ka3Ê!=S~!5lc6Gl[l%aXM0L{Lց ,K<2]4[#A^dgW\0aQ|f0'jF!h8vb1n0 NXL אC(J7d F?^F ;V(< k펎0R9*[(lQ)!K10ZQsKGj,%0́W=AHSuOi&,o;>(E‚ڌ-L~-6L8` #> y ֊Fw(*$Z#HPQ;t&Kg$o̍vƌעq#i!VB1Iqwz2đ;xz$4+8>l 53-߈c @層C%q6 o;:hd VGwx0ڈUT2K5 0jQrV&CzԎJ$Ʒs %Y 5j̢%g|\^w !XpF$O+$:}@n7,E0+(^oxYiو%ɸRO(apc0ճf'ط41ΰȓu`՝7J _(bdc' ho9Z7mfNez1$ %ѷT3V:%IFV|ndlVCW5qCLS,+ Dwa2EQ^?'Ѐl~ޱA-f/gdFa8s \ﶗ*ܑ&ք=qsCL! ̀ėqcR y{\H^vlDT*{xha'D>ߚ/3⟍x#K@EX?< dD Ξ@D[~y쟍Wg8E+_lWM+wRO_'ӂ4u҇%Kϳ59}gFߏFIݰ=[0jzvŭ -gj""R S4{=KO3.nͮb[|ϗRcx?xw[3^8L]2û/~m_\+nԥiKgZ9~seQ=9ϗB9wljmCڇ-zF6'G:l]?>pxl5XCm̐ Vבmy邁ŕ{ cZx|.u3CYIG=C2RB~岈W1A,+EZ1Zˈ {v zv&zN|}Ѡ KLa'NV3]`1F~$bh !;9α (cbI> 6 fA=CC!*Y ! 9ja>!}\ƞKh}Mn^r{LLrl͕ko_9RwfqB)zr {.OO^=PkRnZV>Uiu*p%CH+XsZ1mVTKSP"fAG{ E4ht6ZvōX֠=B66)wL.#!ˀpnM0v3]jKjנhwtЈڤ7. lV渇eA)h\w*i.<`F<"S[ynCa=^"K+>4hlXިwlP .0q/X)oZZ_o~]"NlݨrFTD mFD#9 n&nωDaM\+J c{?7D,qI[㛯㣷KzbW}dJ$KW uY_n6bsQOe;8=)!=9iA u^hBC꼸\PO^1lY~ڱCJXBma44Mڡ0@\*wW 괗6]i>?nµs#6O%3.hwG'uxA0t~L[Ԝj;o`Oט@Ĉ>5&]Wyfe!^EbUPn|g Ȇ0Z1ϒW<-X𽀬]-ԷZx\YD=-K *kP2&9Ld,euβCSaBr]:+6$+!r4`~vCKR=>d݌ ^pмpO*YYr75|%KUK^Q{b67]09AFihȕh=bjYAn6I\l}NS*C9n3rn[[J/\aV+*UmӞq!:P8+Lڧ ͠jH@#}7}k|8~_ܯD s3CاhFEJc~c- N߶n{%s.mٕ6 \?i!ypJ۟{7 4,Tj^^*h~0Ziӧ[µyKкCէHZ޿L>D8}|I}}B7~}ׇ\ >mTI`LgPoHE"ϓoo?z_=:$≨wB49ḯ0{}gMeB\U=]XT?W# 3J"܁Q,zo]_E֟.|#M0nhԷsFbSJĺg<HZ1p.S~L)t=CߖeԶŗ]|tWc{y\U>)g#JsWY:- Vb ɀ7>ZAV,3ž/գo>RrtDgˉ+טq-}Q%㨺oɎ)H$AZS!PP2LɵQyx{}oxCYZ# T%)ey™B()a/OkFc Wrdb£ w<{YAWwSbZ%߃'O ~3sf q'^uro?~*O,'o wFc)L)C6o}`b`rkB;>䝸-o7MO .$(by 4yy}C|6dPN*7(Djд%W/j ͆!8HL!8L u?X|*%anQKv}E`4҃.vng8P4ONԀ62)pT$%/DbJ3(Yf*'/kyiv2Sdԋ؂8 cK}"GͶB-({8vtSaUR~Cx֦'{)9FcɓkcƟ%T,i(ASoӯѬ_'Œhs+6r{p&<{@lY9dMti;9bKxiH) xr5+@Njaf#3S`<>e s衳K c-ΞdEǟ?Wyنm*?ݥ_g߯owH=0wq !C^3 %60gzŘX'zG7uzg2&Xr'!.Y*߇(N_(>an_U5ϦA $Pp& yMܚI!op|%$3%Z6h*IU<,]15/ %hAL\|F])TqÜC@Kw9^H!FZ@8V8 9L c3Ԓ1ũ< vFkੲ@M ɬ}k`)%WjD1/Hj{擵ZQ!)E\RSqN@ [Q2p6hyjIY4F|Jjہ2bmHC=1˺DYcݤ_^4#[xDx<.|suAb` 8"@uvuSI&!$0d!~laMij@R#*V)K/5MFMR[^dnzZ?j4 OOݼ=Й?~Xapx:S!e'"3,g?,:b)_oϰQ &Y?s b֗GwS aqN/)霝`\?|%8{voݰ(PnmܯdM5f5ovqVCœjA@*e ȯܷ|6bϟ[3ObR#uŎiOj\s"e $bd[F Gފ!*/[i j{nJKʇf h+now@mĬgw?cHEUD4!1Lp)r Uw]iRS6{r=~W1M5c\PB\tISQUd(T Te=T*~cnHZvAa{# ?V:ܕb OT ڈ^RaTۍZpd=w$٠1 Q6b]HSe0OH2UJ4IT>xu;gPrE+1 ?xזǁOpȅ=>lN-ZL$; K}r.; vr~zm-h3Gk6ItTz]wLAy-/.#OL@(K->h1 2xĎ K+(ݸ;Bw?>7dks$2r揪eh) ˒LK#+)R'-x5jɍ6l7h tmF2eb,Ytq"4%')nfy9idCE;2?r IH9U8u~ei.A`H{va'-[{8Ml 0c>r^ 4#k`q/A?^i%'ñQ.xXq9[Hfq6 K`cPPY;B3] ^1WzTaz"#!$BF?-8׬ŗ j[PKEzN"| 4L^phI01%/ -FCD9HMXEcї TzR*W$@Fk zq[Z\ M ]hSQdiA2P#w+o}N=薆%I0skoyX;lWTFJH1G$LXʤyiuք`MQwd Dj…6.#9-'m{%>zR+.e>87Y$nh&jF{Df鬑y~=ܑ|0X=[uU?»zx}R+}[!o'<}h4ߡ"˯oD.3TUvnnzQO)܃Xew"K)4,+*/}DB][O7udkLam.mQH*Bh9D4\s*G5u,,yB*,1Ȳ65uK2]O~cs:̋ GW9GMzT9.z{M\H )Ɵ"1J餔EZb"0 +JF)tgz1dT!%c]TqE/9HBu;iR\mݼ,8pZ>qk<ُ ~D5 q9E 䯨X1ƘvF]!.cbz P^v{!B'|~"2B&]~Mz}λ0R>8PU 0IZ p҆dY:y}j 9|À#9$:^L8L |7*2LK:qR=il}3IXj䒭i@_XR3GX+\3noyddI.)Zbgb- ZR`H6zwscҬ[_ qKig/9 X1UꈢdIH5 XqE4Q03<)eSfǒ_jDu&#K֚p޴$ Ɗj+`ĭ6ƠQKE.H\J,ԕ\ >xzqNE=b 
Ʌ-8?f8/Ksq4ėw4I7..W3)ϯ^OCgc[ߋǣI'e+/|Oڞ_RG'ܷ={tW'DOPS W/{^~RV%.RܹSR~W%z_Fixb绷e}b$]/XU}ޟ2<G渤UÿFE~U\gz`6 t xYn*1_HPMI1sׯ>I>(K/$X ٳ{+ ơa?w!*K:-ݸxaF&sFKuo|ՀA8 7$MQtp?IR'3ҭWgqVyQ3 '029JW=~.eJEȱa"qλjz9OޤdMg/U|>= W゚:;l0bO?|/0/)f54_Vee=%HS:-"{Â^_NӕҊAI_}9<2ltـ~w>~NsAeVz1ҒsD[0l R}%Ǩ(FbC1\-a@k_RV9 iZŧ|4)Lv`ژ(%[+ܤ21fu4ywx1]~"ӷi{ xr,}VL=܏[lb@$U鷀Sph<1qӐS?Q8su*ɳK 2;d?1'u/{'O^ aVe,YLPGAebaa[%Kuy!Zyxݗk7Ŋ͛ȥVךkM6ObMąKyqnNIhKi@X ëEKIgr.  "pͬK/dGZ)qiɸ:c:Ls/2#fPqkvquHn̪gy_߉֔hBt%7xVK!x<:aD$iw}zP+CjFq(nvGviS#<ǃh{spZx]qу/"v[i~ %F[K ζ~v4Ho}"1xk>/_ȈTKk¹P<5EלC4;7TG+oFNNb{",)GH㒉k su)3,/nکRtڰűH@KgqhEqZN_ܟ; dW} 9?Ġ(eњNv%GqE4"Ǝ#D `3-kP F Q n mqpJ|3b%J2~M#4ଢ5 _f3˘3<3ʺ #.KR$X`hJ+Az~(uVkEhFLDHaw픢 m2%^7z5ߧCT⬭fzS.ӟrjl8b]B9B0RHIrígq'©C&EBY4YJU&*k4kSm/j{tipg{i/7k)0Z_mI'B7.ARu$jJ#e"1AcfN8k58k%^7p #'i>YN F{S FtM\*o? ucv<]5htum]g(ŸGQnLRt;q*bܿF"&N5wf8l{[oppkB-]LӇ(3 M0)f7.ü(NP,7yf8Z\ze%3^G_iXPE/R }b{+]=z\%׿,eu\P:]s"5Eٮ/޹^-yfRe,('WQ8"C4cJ8WZ{*"t = *@80-l3KN5rjhihx-]ChG1c1M0ǘi7| SL^M6C.+RڭA{<3nUsڛayw)b αd]}#9%iB/\i&y0Zi$,d4Nh{dꬵeH#3 Z++ݨB20,j4rRD@+ʑZzBƚKF0Ijg%(b(pEbl j)-v`ͼu%,yf^HXC?F`fpBF 2pO%2xm1 Wz?PeSNayw9DmӨ,y-<^G! sge#CrM1c IaFl[(n\Glq%,y\f2[ER(6AL1.4Qyoos$ 5Ə^oZbRBLHsRRD%uSH;N&)?mY}I[ZpBˎYd_j,5ϟ×mKa"(xGz!?INFm0Q"C4ul+ lEq1ڈ-zG^L0ۭOJC@tۭ)&[W?r~C@4oSߝWتRH\Nc.Mc_}}[792M~DC.ex7kUPa)0B:|}4MN"`T|EKjS"AHA[de w7nu3xWd \_3jZ(tVnj({y&}e5z [>oҰ<]ƇٴlZvSz3YSurXWyYrSrߗ}/`ؽ ~ۤNx#» 2Q5TU*VowV9K fZ0XoAy֕64M Y/m(v<v=kiQYXDZb{_7@ǻzyV|k}|(3thg {ѥlwQiWH26-y A@$;󒹱Ql"&HGQ1* 8@@Ģlbc lU.vӢN9kά7P?VMw[YKUkx"X>5-Ug(Aa1҇zXѫ"yMwQ .ɖ&WViw%m#~@fw Z}zlLbę],&Ar+c俿J%e) #RfHj(D]GbU!;i5B R+3.[k%fzsDUÀJOH!3 E<&#H58#|ցEhFo l`@r5X^`bF)c3޳QT]à bQ+2*:lrKnPeSwJ)Q[IUt-_cK*y0N\Q-} ?n;@`3,h1ʪ#$7;b=".DE6} іA\!HVp){p-GP+4oZ= WBs6KƅLEnG-)qey ʠM}I{=:9O.\f|[u[|M\*ɷ)+}`B1Zke,rMUJ$-D08Z}_ mFKѾ3hlhIP:$aDKnpP$Vs"oDhJwϟSN\ݬ\ Vv; ֕ #·Vt|O9Vb ;xvgoZdc2e-m)` Ed" j@D="pm Zzzإ6'GWK5N=았$;ƤoM..J:\q)U^Fԣ\բB2E%ޱx&I] L3#Lxۄ#Nz물EY ʃ/Lҥɞ̈́?4#3h~ZSt#9JCsh=f[Z@;_&k!6T8cT3~^\t|L5b QtΰNW@jvE7G1[唸E5pG-J<;&A?0'W\6L< kK -RE)}|&_'|noU뀘z~hI7k+ckcŲFQ˾u/ 3! f|z\͎P2,Z&< pD R$DG!|Cc{[%%J$Gp~Mc{a{gwk0Utw*Q}zg^}Nf7v:|:&^ӶftݸM[ 7/Vlo;f2Sw-It̠_\a~vNoϾ ־r82@qX hUYO{lB9Eef V#H;F?SwcwS@):'¦H׻i.DǤ/7H|Z4BS*4`3{Q/p7|Y _b$ioYc6#ӈ+M̷5ōj>:uf,v| O჏#s]_z0zِWtZi֍eMstt2KФۊ4#=c^} $\xaF: ޴6x<6k0az2;ss cꂯr&n>9֌pY6+>Ї 8>-9q&H+X읋% (] :Y-**sdYUj e;RXGDf@9aӮÄK:Oc]cPQUbZ$¶aGI' .#H8v̈́ 'zQ$|zwC vo/f/vӗV;X-ܤ6[ܰ[U 7C`y^߈q;Y-;a6iNYƗ6#°RFݸˍGY n ?aXƵ·3>GBF6q5މ8~WoyA%چ-S /*8Z" xqMy$<ǘqX{haJ128Ζ:[7k=o >|H ݎ9e8ќn)a_[eZ ,fyM|(9EЙQ[n51h=4[q$5S(Wܚ'xZe'#ʉ]y83;2Ϋ+6~pLKWMKFiS!J1VDy֠[: el {w8cMҬwng{oߎT'C) 3]TâqYGUp\97kaRjBmbG:uy ;iYdN7 ک֠ݶ:m;kA8b`ЀNx( (2"AiӖHo~mmLL2#ܐ}~GwُXDMuѝGwx?kQ2}C,8H%"H8a3I̓'2*|S;:6D? ~5Gч~Gw$B(H84S$6^d gZSѯf ^-538qrr)4LݩT&|j2*d+|jeyҶX9KwT֊D"d4d0 9hm/A,P4o=L(Z.0hedl(*e0ǵ g_OR]Je+ǥ͖$1GZPW\WKqK[]x+G ^U [;Pm*%;}=se:b29ۓWGrҤ~dC0=SmM4b c_S͈Q@1Q$)TH(CEA)~=r(2wLL{eưrٝ{ؐ! 
eLEg!c>/G0uEQAbE >Z6bU)|`E)CXUwnx\L9&&dc~AKF}%+*'3)Y`XY EX4& B @P+P}6MڽV; AXPgK׉7Gg@Ô\7~&Ni,y(:yfuZapg%vnmk{DL{<=i^"ᛍ:zOZcz՟i¤[(K^lɏiΒ՟(L헬1|اK i2mϲJ:h?ut~,lwJT^-?eV㲻Z܌RW}EWZػlnnF~$ A.^a_vBI?ӎ߿ S6[7g8pl~WJ6~ʔ\ö8+!61[% 9H;n@YamwSbhplNlI賶Ov7l1LD7oOW_߳-wDxCZ,) <)瞯ÈKQ e淒/^j gٌEդs\890} xH<*c`10?c`̯a+c`cͤo'VS2&9E8?I0јӼeXe&x5\,\;j*Ӓ~);F }G܌-6D(6JAH8Xy07EhSX_l5xQ8vE~c͵bP\ 3!Z۴Cwrq_Ȫjx GA("DF"K"# G%bL@T,c0AD*u$0#pXer=%˝Y-0>DcS]f'G [D בTiDa$)V H3F ~sAn)p%ِd6>Y=3t]3E.9|kz"1Acχ:>w͍He4{rU{ԤM]@t<^ɓj5Hɦ%ERΚrfO7Fs LpQڒ&G/L- 4PUdē89r/)ws%[&!V(9׺T iD; dZZA3^L TI0ǽ#eEG4`ZdTГkQ;n ,t râhԃ/NL,:,×2z2p)clYtRo?t¿OۦiC#hz>7i]z6]\nt"e]z}9qࠈ5G{IIߥ4O7_낉ugJ)&g?b3ow#Ԝ٧`؜Ӡ"}3t\ѐGCFc=c-zwb]%\V{0Ґ[FWv5*%/`v”fwZ ǸVINx?HRl-Q_ 6Ts3#2i=4gJ1T-3uCFBMzFֻic&hHK7Bdŵ7Į8rOj,i&wi4\'\e |ߚGSb=ŁPYCsfa@@TK&D_ťF[Chf#m ;:ZXY 20!Bq<0šrKƉwRA _:S>ȽDNi '.; %O Ohb8%{|6V4~ۭX_藧} E3cmӾ=qO&sKoD#8-bi\n6] )xcU)mOfc,pX-NNk_+2W{ Kn;+7=,N":KM^5;x;U49ir,,Ns /W,O^IVݐˇ[;(wn􄠕n$y{`86Ifnjc39=39.ah<ޭ@iXYK!>,+@jY|#n;rP jg7Q Ǚh%=0 4pUpB oW.@ I8U!h,cN +N#(V'd\Ꮽd"уUX5 íhI' $&bDf{A_yfs%54[W NHuݮFZ 30q'N iwؚťGbekEڲSL[v[ڬCdIDKD16Hq4wم@MGoc[ըRɻdOҞȽG|hucz4)v:>Ѥi{*sߥqs+7?dR}Y\}y;+FyO9DI;Թ~bZ讯~G›0_/{_y8K_!a@ (ߞ ޹kޭnL#8vX2_+-=Z7BV>f~KFlo>3|I`wIiVL~H< SA@#oOBD> y}vˋ4rw:\ǂA:9t0{ϢIֆPCʨ_meX)X3Ahoх0DII0d3RrRRuênӬL45jVUz^VPH]9 Q)[CU2alaq>N6?~մ&WfG, x+ NQ},fiffB0\rlyÙk4V N|T8H'Փzxz8TZ9QGp*<Z8Ͽ?tf?O!.؈,e0?gfBO|q Mc9{jCKrĜE5e:s^LU[- ) e Rڅ-D 냚lpN jb4<`gr?Po]NKq Fy:0])$WAE)+]m6pW5Q=nS1-m1]GL==^<֩(ÔSaW8k@d @<֒,-G/cR"]?ȢPZF)5X۹[p,1";4^:]Ǖh/ee|94v6he2!۹ ;W~4^Is͹.'>d xꥃl2yeAMȦm[K/Vk&us-C/9=$ x# ”e<@x__^DB۸hMºպB<{oλ_6%|m]T۬i.3]9@K4SkttS\Lyw<[)o[>,G/LyY\]?dSsN5>V=0 E|!6OfW=hzSs6չ?'1)jO a92ԴpQ8QO826J5>!|1A:\rQKy,XoeF8:x[[;z@U|vUtNJGnx牟gfjs_ZdJK"O"4 Ե|N$d?u”21^7goӼ5D:P#ݻz<%h^r=kU7Pƒ=TksobK^] ڨjLmkן/ngd=l{I9@Ȁ>|qMҵxʝ{⃼V+*.IP&S.)҉joFCus9ա |3kwsD..ۏO br}Mρq4 sD%f>A^<~Gmv UAv݋n:VorZ67l;!U}`k{+A g4q7gn+xGIg6.Z#!z JO+rtL_7R%FN&GC<<]qRYOECyAb#ސYU>MiRBlmɁ쩔AVѣFǼR( ^**D@#Ppb!i3jzh9̃ЖRVYNkS\26Ypa5oP0/*dIޫT,(SXy,Z}yP AܽQRpO4E㕍[9TIcք`bI,ձ f5b=ĔBŽlAw㳉Kc$ݨN!ޗx!ቯA>'vSU!ZZ*jٺ Ooft|dī wVK-;/i[)Mlqvby5ƀw/'@jxmkQj{4sF .x7'ƽ-*3b_RڷU|i,X44"(lHQ7D*hަbIyߢD] &#DS$ R`Ec+,VRʊ!ICܴ^Ħuyb<Բ*řA^fBt g8*`\wNN15ׅF.+n/%O*ˇXB ogDubFUAoK{vﳐN:c4?T] U=|W[^ŬҖw:s%bӒnexil@w4@zݮFv!PwU+m2Ņ9c [SZ+`Tڬ Ǖ[ؑ@u,;seԮ, ,wp0edLsX$F*90^MNxo9G2O<\Ɗ/cR>/q@ FmgT}Yg+WRpr4y=q$x9<qjG; 7 Z3qFg!\SD٭3#~΁uY'` +.Y?o_ eYd_Nn.Jdǃ# bu{E)ʸfG&u4 IpMIPeB2PVzo"G¼c\h$vF;OׄYﲒd];P.| !Ҷ.ʋ-}c~YRSRn;'<.A]&TzCtrqS>ܡ7 j3+ut#FPZw-c, # HLӞ(íq.JySkNle=K0v[c@*nu,Y;g1xҒl)5!u,dcL9k1֋lcBhgf϶cK#p)ғ|[/ k "  ж5F Q1.$reex;2wLuaaqKϻdOIơ|D3w9 Zgx1Nuux XSha AO2ZH [K%ơfG=b&9g.`޼MN+ԏ&wf{ * 3-euDz{Sɥ`O;VOvFdRsp$Rhʒ[UfIsD/.Q`5NLrMk^qla,A 5R ^{*%܀f0 y=ܾV,O݊öa;HY 9LxQьqNPcၒa0KJ ~~#h18]Ig#WnJ nk< DǸ I DŽYw\|AJ1CB{g@G] YuIH˫.H6DP k wiύwK +u$'9*F4GǭRڲ(X ` ?lF%,p#sْ' C.KN6C:ɑLU%W)-H*nNQi0T`'If' \kԎPD5b >Z$wR.Y(:] ׌<K~ !")`^T.K V7쨒>0 9A=Vm(ր\bB7~ݝ@޴{;@.jV|ې1 Xo}.`i=˂#̖xG4{ޒQK^_07F ȿa<Cp k!e%/'$=_F$lrh.g0{?'sOfPYh´9 >jjE,lO7GPifWo Hޗf2(Mcany0rDL^ 5 &|? 
]ҫw)8\P^;D1kYre įgl7s,o@+ _qq[$F(bLܟVɵ.6@(~oŵ=mh?qqdQtHNa7),٭LN$NN\ˁJuιN:czYyPGPk^/r[!Bi_x/ǁz灦{"Rωb')PXzo´;,s&=|,P7u>#ԙ!耐mt gH@͞}LF(਒=rtӽ蟣3T+!9SO 22gK$|~'1*Hod}`eckYz̰{d9-oF钜R2w O{<ru$̎heGG:#8 NspuAV^QZɜVotӋKrIYKr)'d4ꜣ=L8=8[Y8U8Q+XvL;V[w6-O /ud B7 t  I0ںtNxw3HR/k]jC7g5ZB։g*0fY`S Ϟ]V'¬~1JRF]eLB8jtzH(B SL %/K)e+'#TKmc|3q¬Q[lCnKJi~MZo7] vƲg199պ?-DaQ쓌'GHbb-|)<3m Xl q@q[Hj@fGL\'vɔdK s[+aD7ojU|&Ӥ moWhHNu'IJMmc`,.{<ְ֢xh`N`ik&a.\cTJw ~Za2𩬱dQvs03#ꃇbp4f@Hz4{|-[UU {-^rWF\9EHk J+X~:ղjk\kv}ZgY-a6Hx?*}2 OfnW }Hvӯ i(A?)Vr븕\ǭ亾{ [b,m<ϼA qS9б;*R20f2$)|sϻ +x|_i->2jY01pB5hM1ӛPpz3XF}jz|xP^i}QzkGQub+IfФ&KO.K9y?=~(lTA3o+W;BU|5t]=1,Bҝ㹨A'XP>n^>8GX%UXTvX Dg!(ipPhDJR3B#EeR#0. z . mFnq41!.1(x|% VTI_[Bq&E W] )Ȇv1!ٻ߶n%ؽ+&X.Z~$:Nj9I;9%GCH=z6ly ۋgi׉BhZg# ` ު<(~|SxSB`~bf7H_L#޾+@_>37id\5J/pqor |`B>XR->8ѭP;]?x^CyaZi{eC˘;:rL 7X=m[[O%, ,9g:ù Rq뾍;.@s@1X_nr%#:w Fyb^sn@S 87yk­ k_/0]~8( қ])lZJ[֜ Z A;~G w"\[4`SDz=._͌6o-ƛ=2zKQ›6Ο1ICn eE)jun98"7[?a޳eⷫ5d&wǮ$D~c!rta2euz_ww4l#az77ww ߯=MH-+h184wrKѹe8㞹%wZn֣sq;q9jCspu'72OƑ7UH.>WSĩSRGttmalU%IS58|}b#`'DG}wu> jnҟ!AVuPD؏1?DjLƼ^8_>[ QE)܎N mfjJ?:\grAi' M7Iel&FG dBTNPL8]qځm~آJ1VQFž(y3uvޒ~"O?W}7icY;xgіEmM|ջrjPe x0,mjZ5u@gg9殹w/:I'4D'VԾ~B-#vq lK\&XeQ||ۺWAWG~-46'52m#lxYWgoBtM0'MhSK&iT@i0Lvr܇fųd̀Q\rWV£x$If~+<ruF J+[zE7HW;Av%Ecw >q.7̾We=I`T! d la]fD֢Kzc11$!l*еz*Kg7 lP9oxh]gѺ΢uE-Zk"6yt]) N`#c&2MK,O)Fm3!`&R3R/M5r%GFףE y5''-K29e BMBsKXU -쉈-hy,CF5 #&pǘ1@•#3Wr;rtQ3}dYיZ=W.ULU Q.guK%8bhҘ$!,%p!:)႕dA4 L V+|?]_\0,_n}+s2Fn{nTA"Ff4Bqu9`K K=eqЀeoKF2?I.oR0w[)ZՍ H)TIyXq3X-K=Z FR! BbwٝO.# EwE>n,4Y/zPaztfn'<8*5-GXtD#FɄ*u& _;O>ճ™[H^##cQf^9ýqTN=z4< QX 5"mA1mUǺit "x QqPxjґi`OhyTyO4Vù)GD1l+#ZM\߮~N}"|CFn-[}wo D>~5}}x5w\8`2gwCD'fQsa&6ګ7ih$aAn|viۘ-/B"K 0`! k]܀4Q m!!b6 :9FKsB+LF?9/?7ryE碗˱7@S8y3U`L9onn}nғf=^A4gR3#HyT}h6~Ryx}0QQsBm+ЛwE]` : Y&{:~.}(Lut";po+(`z[6hUJu0z'1 /:3&jVIKcc= g롫dRbYۂeo鋘ṳňBo.z+cIXW5_[ Ykn߿Hk{#FMNv 0+_l١B. 5RŤ>?|oCsyCX zX^\,<@; .fp^O)4_xБ^Ūz]x`ZC^K}ʰ7 r^m bp^0)-^]58 D8h F&!(`_ECt4,bĉC`_| w.Z/;LEK;dمI=W>m[/_Љ]1h>TA6*['y Xāӽ_AT߼+GnoarFυE\AUԙ뼺lA=-8Qj=C8OjC7tN6zbl y& >kZ#m=Py ztW;:&~p]P#y\43wms|єxF U\4#W;&(cvԓ?'sv-^{(9ji(`hHh 6Lt h\Q磶Kz yXO =bsc\QPu~zONVTd\˜l5C@t9m \ =4xZdU%`YYsە„% ZX׀G| $^5ALY44} iJ@v M#h@nPAkLHL os!0.3hHhe$mV 3%cN$3rڈ| 4=fYuOfɟmW IbH&1O%'ͼhCbVj0DxǤALƜP]Z@i/%/!,)vӪ0%?bM)8 '_&;e!(5V$OEڣut>Dt<\~S[1U`L=3$_*)V7``H!&aZ!c"V H#j+,7}9VvNdܲy–e'@C s@ ;eDI >ED-;AF%gf h9iѥu'2U\bOCiTwb)sxvJ[2&_.mB17m7&4QA ^>O?W}7igpNq r-&S?SNIr-, IB4$0Eh;%ip!$GDFLLtb$G0sv5GIęddHEX9а0L`J2WU@ 1ta23m2i+j+m5sQls-b'kG =[IiF^ǼͱTd>QD䨙NqV jJ$Uz%yaNrx}E#! u_ |SֳQ6swvC_Gj8׾Aǧ;*J+*JI]{BqXrx? U{jۅV^7ǖn$}CPP@˷MIly'JӑGۙZ}»'6I  G/_g~}fv`JVoQ8ܗSD@쌶RA;)}d˷*1rvՇ1"T!_މ,UlsxP GqUIp؊&j_}dVKL#0r`&Α.UG~tBw(U[s$Efsq 9#D64@vmfk~)3*e^g"`o:} ;|?/yׇXz{)|1'O7 0[4`U5Ά/#Hè]S;kvfʔm) b$KaOI@IT-kjZ =9k(0-3QWxl'3ފ Ql\z*Y~[7O^%9P pp~sy Q }zN^kl^SC詽(Oo?#>4nBfL!w5Z,<-iY:VdZE|hltv{˷ yhTx*,KKw2iP)Xꈳ[Nsi U-m@gj(Tk؟[/ɏmUrFkdA/4pI6‘'}Z3'yo7ɍ͘&hƂ7Rjh 6ۇIQ% 0}h4f4wT) gGpUtW\uAw1[n.u2("Vnj@4)6A9Nd<>mYdm]-ŒYk'2p؟ϓS\^87RIq..nofd/^:z Zӎ@ݸ6`Q??xp|YJ}nn+(55QJ 0Ax`3%QxS`O=V6\::{ڰ0&[4(;"OS 4V6_=:TeԀz?mY< e @.$RiZEEoXa] C嶖$@ɔrʶx?>2ek^]Ω ڙ&/;97~Nx2ym]x3sajj1Ek7;m(:' Û 䙅o:m l>wVǚ4*T;\38*eZ)SDkILZv1 h1\Eth$ĵS b Dy:,lEw=90pxG W}o /$)祪7}yٓ˞^akx7kP㍷?iMI;|p ÚOiF8G߾e.S\k#n bMۤϛ)n.1ЦmtC{wxΩnε7]! 
V*~e׮"jl=Z!ES9&g\ bOognZp6dH624jɡ"%Y}»'È$-盾{*ԘVYv] 3[#3ofv౳zwO8ܗScbeb]2)w"3y#6PÃa|Zg#bhkk/ya\C9wеxpsD* yuv#)9dB)KW%i"t 'z'7j*q@[ȃtҢ]KdA9!KeBGUӴ{wxbd6@F0CVHsR}mUOM|=C8 T&LݓP] OXv\/߾GI oyICNQ8t& m0' Q@/ ,.;'O-On!)O$7Y$cq29m LDN8U$cьD2唒Rr< <^LކmɆkGqS#Nx ާ # (Oq ~u8| ԧ2k^,-{4eJxLq<4\eԝ(Jc݄pP:quQ`HL)#Da`Vk^qWU@[ym|B-fS,h.HO r\5%1{iha-I8{=ݾdS.qpu25Wb8BQC"Kwea[Yj*\8uĪWmvnqf{-CU-~_^'WMѫ>.\^G7rbx9u0\Tx7Oa44]|.ٙPί%SN?ǰfyEꡩcc_]Hv/uF8w~ @wZ.j^MEd&DWҤM-qFV;/0{E&|^\胊ܾ\~ы!GQ¬#i~=t=~CXw*;޾UC/c`$$/mKG?#5=@w4wcKXܾ}m l(/ԋ3ËΝ,Qqѥû;O5~^/i<.t@kp m&R_VkyOAtc1LRնbV hP]ƪ22p3 Rՠ1l5+i<b8R r~uېX 9m`RfJ qy+n u]GR/_Vewm @Npty(/'?~yL v?`me0C3|o*9A*&H\ qMXCitD ; mThxTS S4VqGu*D9%KrC:Ծ,I<Oq}ڧouX7WC$2JsʪYJ˪2e&5oq{SСQ1j jYB#yEx;}A:^,Atk1Tߑ =! de#AƞbJBrߐ탨"OHulp,"p2c;0 5BTFe^JI[t_ҿ~Z]5S(9^뢧MҼƩY6k*BN X2h~ 'A7Cp\)A Ab7gt:`(HxWOT UHEՙ(Sǟپ7NQ\ яzx4eF˹ח/Vw KM9eq-qF8elHBg;~'uð}㝅9r|=qD LfŲqІ%stuCbD}k$qP(٢-[׃"__#*d{㙳B[*<˅wю[pʔTa2ʣy~'c}Nq Fg Ȧl>xĖ S j-yWo~yՏIk,r_.6QRp d%hrЮ'^$ʼ#R7UH!4`t)}'l)Ҥl9KFT2]^rjڬ99S$,!BB\Iz$UϺ= J6h:b#EbƬaqԞwϭG 4x֓@Wt+M('*N4dUзǝ`]1[XYd8tş>^J~lxa܏˻A K d~$fwUbͪ¡ؗK͐<Z pLEXV ,d'j͉48 i!h{>{!qnpnlFRjD NpQD{ݍ@wxux 4-e2b0f&]n"TLI *_Z(e)~ЂQ!l8J`ڰ=4kw%0 dȎIeT49䋩rn*뜓_Й`^j(m808ҞiTrʲgI ֨`PWgEN)Zp5P1" j[ƎPVJ z$ڙ pݴ",b$DQP-ԉ5 ZI۪i㽠i@'c=v´Ust%՘]lXE߭!]Lq3jVCۭ!*@meSL(Dw2Tf(p`. x.H,I*a%(ňf4Τb% ( 3&]s7|>zhD @TrE˄YRSp-VTWDgPO1ǸaG-G Ɏ7cl3I3xNNbE221G ΃CQUL1Nپ*YfYHfT$63$S3X*J$X1 dN-RTIrAk7"&9E5)F`b\Bt )UT#0 ^J10%nhR~^$&8*sD2AEIS|-x{.8~m^shp+Zcesm+_W3Μ>QGvFjQTZѴP`a=OXGp,MT/81ݤVք2{ΣsBh蔜:ITpP tՄPgkR*>zPϭ|RjK>xs+V֘кj7׸7rz@.jϜ\_9nP˯Vͳv[bI[KC'hEQm=!'jrvp҄e9#v.m'KrTv<{Bߣ=\oyV^J,zietgT nfCJ$}>}6Wp͋1IO'= /kSNS%QxtQBS'[ 6QxkUB; ]25)t;ZNSHh|NUXt\SQ[d3[wUŸ1zq͙ӭ=b\Qj˷a<Z.ULΌnn}uY.FJy^}yv3ol;V>J?} rD05l<;x^J ~{wƒ\l wy N.aEFPrA\<:%2bqDX Qje83W ٢ LLZjL IoIBDծ4b%j[j&h{wg/0M \T޽;wq:[>#O"S2?Qn4Oe */0bV4oeeߵ/"7Qon$sK-兽,Yr&,Wr2@1~xgA-ɧw"IHHj^k߯Rh~DQIU;TBW! ]5/Um.HW8B-tbZdrV+Iw˃Iory ֑Gյm8tp*,{CY@C1)ڄNƊ.]hVu{ Qp~}[1>,Q 2x8 k@bm M{K)R [o7a4OUқ^Ԫ\7*4SSM)S}"Ԩ}P9O `͗v{r(⒇3 %F)vd/ͭ.&|_i1W~Zm)IJQ,-)(2\{2٪ BK.BK>/'*,ߊ3؂â,,pd:aS#vr 4TJP3H$ekSQQ&D{L̿,EL×h5 E8+"{ kNc.PX<:G/TAjMl5gWGw#DME%MUfM`gˏWS{i祖β_1M忘ޓ!c5vBw iixTE H~+>H4kP wr_p)dA@¦*\lXӻw91ĤEpqIE; 2^8uitMFm'BNޓ><Qqv"qZ\lww‰6zG -<+=S`HIzr=ƫ{7F_?*Xϖ:NPj0&h|=wLͳOʤ7rK_v3NaHl^v`P$e{ڮvr;AUe\rQX%YJI[%yn<'u$FmvSݺD @ۂ4&F;)0[_gmN<Hv_M&xY@bq7m:5:%k`|~ӊdTJ[6LE֠ ;ZH/]߲W¬`OI{wM//"i!sΔ]Åʌ5>SRZ-x0lQ')*g Sc^J@P$!qDLSL<4? lPu8j`թO{ vp ]7,,K%E׷D^,ӻ@jjJP]\0\@Mz zmb[oIQ0ºp^^VZ&^GfaM~$x:T>ZAwQ_f.Z-NKvYqZ8#̮:6'3a`O {6~=Ղs:}j !f;Y ?=E8K|p:dxZ848E&Yn7ք ,jx`vo›E뜅9 Ŗְx|bf됸v9-.a1 UJLG酝PۋE~_ b(I/K*t}{Y/߂v*L>%a*`1gvKGЩ}E(*v]fT=\$Dg} ^#+\yV.EV0I;/ Lҡ;l]aLbjGg.)ʑ&W{yi{o I~_n#Qf $ o^mAd-0BHklZTf}3dl4:tF<:Yy؀h J '?WwyR]L> YAW+g/gP*A9cI/7> _HˇA͞TB-"\dP"6 5D]1ҷ8hJv쳥*KqAŽ}Uf)MM%I}8y$iSn=|eid&ʚ4<5nA!/8|9w~b~U.˯5377 Qw7Ơ_J;g)B%WEg(U+` ؂ +͕NamPa"vҕ6DkVޚ-ЬQWѐu8Ѯ㔳=AqPC`)rUOiJV ]N;xRFuϧFq%p\;#,VftڌK3 x2T i&N6d*߄R xdP4]긋Jtk"FS4NZK9yѬ@5_ [Z9E+gHLJL9Loؽڏ}v?XϦх(oATTy#Й:l4 lPBTu]x1tP"]ln#JX'$#j.NGs]IQY-U:q%Z~AQIS+3J$۝{2$xK1Gt1qU`ҨuusNB@MvdoJHX;ٙ[DǞ\"N=vZe߅'&꾡Ԕ[X.A3틢B)nႩĕSy,7@ν)mm*ik9e$U 49٥^X+-WedєF\5e=t!Bww!1\޵mV0pՋy#J=E!)!2QJ%e㔧AA{*J 0# ']w+|lD_*Fju!n3/Gh(:u܏iMҠH%̓ F!Ql{ NX=-t7[Fh'S2)5ʍ⥝vYWJsw~6<塝م;/ac/_{2[noEK\,7P g? )m8‰;$'RQmIP~p'T)%rA9’sku`cy^CwNٯy{k>ܟOg8MYۅ~^L1Cur5ܨX9[ʢw狳ٹo1Y%9;GJ*7έX _QCYY!n[4КD[C:oԢyIkJ^.ouy|GBs%ɤf" /2WB`:h"Qc#qnן!\g-Q+ؤERQ LD׀_Ru#!̅<)s!O\ȓj.d$1!r%σf ^ k̝- ~-9hBl޶0xhR?)OD Aެ/F Y@(*#@] 4P?Z.n\Tz~d _ =3P3] 2nDO\Wv2v~&.V<ʀ󛳰r?Us#ZbLx! 
a: 'Z̋#KDPxM~HپvS7 ym ⟭crH=W-C.Qa1d!s@Ee)Z`a1ȡQgA5TY*)+D0\2/˙ XiUA 8V9yl-]C HRu)U8n@(utcxpWwmpW -FX|=-rdg ܌gߨ[јUj@`m 3R{mS(L5 v+jNDF"~*v{.*a) FCΐ7>h ^Z(<@:sD@W8)4p[p!hмh*paѩK3jrF @d@#]N J5CU%cP)@zrpȍ Kwv\6tnڈ(A8,G/EӀRf[ 9f=/yLTWw!91JFV35H<^jY DŽ~65J"ghPQoУF%Whl6g3E AwAAssJXnP~*u>HBv оFTPs:ꊢ9P 2T A_7$qы}[ m`CF%alGtvA_$cd(^\E'hQO>4o\Kg陿]70O%?ç?(FGGn:,e/_ 3/~}gOGS-~o3f{=3q/ F~#FѕgJQn33JG_qK  ?jq$5<1l4ew :w-Uۜ;8Oyj}͡CrF+=Dq5[^365! QMUR/k)Qكv2UIYG@vfFiF܏M '; ͰlC^H^)kkqIu B؛*J*}3ӗ(;o53QЗ qIq@yb)*:X5g^-_ 9T JF?旻dIj票 y\ oJPe~b1X׬cQ( %E\Syu rQn%/nNotv:CT~]?_bvzxd ︇n|IGf?]ܞقf}w^-8*dz-+7;7yke1ox-wf?{۶o/q۠IlI ^YREiYR;) Ex}ffwf;蹻PD)waQ6vaNd!}3~kzOf,uECT4obTz&@ƕSn$L4"Z>v riㅂx  O2Tn:My@U{1# E%O򠇅Z\[32DوѴce +#B\b*K7jLd%?l*J0;Pw%5J)AD~ ]lcK;U-o/ƔJTzuWhu,S\֓߿}Io@ܖhu)qou g~SD8z&a( c!҅ 7?ĕ ΏYytcw4I\ڎ n۸4 xL;?p{fBsh-x ghvzgg[GSIn⦽ i>#1S°ayԁ2/DD Mo+@@C@c0ۀ1٫T}}Ryz>c ;R$o ;`tG jV`)|IBfNb+h@ 1|[s@,rA9i)DCgMRH1.kku wѪ`6*j|)/<]J'1H?xqF8v?Cs#SlysY2as %FӃyHCxy nﻖbrTW7~];nJf+PŌo.$k6b@>6.?کduO..NFϳkͳoI%' ttcF@\[sgqF:ׄ C&1x]ROkF,\Yߦ(K3i!KЉ5ٝI+"R5_I K)v:OeqIKʑnX3p| "EwKhL·ȱkThjD!T kme#*)/i3(@R2CWEA:մ8SZ7+!$uA«gm*8+5;eB·SAR<;A8^\ԫ|7FX&rرF3­rHWs4fJS+{sNcH{%/KHjx:F0kcu/9G`s\BVN*A@=ɤ ϾdsNgRr=QA׻dRj}PFJc\]J=ъ{"OE+Zuj&U,a 𕠸rp"TX.1kW}op^Fcqnk*"iZ|-K8­Յ5[nyHSnߥBpIi.V|:;\s a;/?p2$p]ֈVN&*PLS]cRQ!RzTyiDA415gT!ʡ, NOԜ/JDžrnGI)8+B(4ףrBғ<.H1t ]ϧ yc44Ɉ49Ŗfi(ҍ]ȟ _/;0&z _4i ~S)j{.>qHHzJ K*mS.'6qa̭iqaPKbk&Gx9.5A"}6VsX}?|Vh+oT8Pw[͢ƕmI?| 0>q=/'-8 3ύZO.߀wNqa?w±'AE/ۃ$)~U ¿o]^ϟ~$?U-!yYŽG_{+8(-3[ɺ NJnJy9w⿯>{ۋwW#!(e\]wū뷯.]|w۫~gEV^F{w ?>]⏻xO~zph:8~Ο?ݚc+JF^oۑJ{_v5o9If'oϱv>D$X:ݣ0:gq7r}p[ s"&_&2`Έh":py=y H;9I3v7~8 D< 'MK;:mmD5 R1S3sۥwv=l]'EǓ?Gh͠۽  ^_7&m"Q;ؽyqCF3Ihw㹎0b(KYOA4 pGpJF0ŝ7n!xx1a@nS@_`6r+Pg4"H"ppI#(XsЌ3V‡fgLfǠ p,)C߇I#,r/(: , F?ŭ>|($tl&C J=8^*1/"^KCAZ6V yaCr{̷1=졆oAf- ܌A~ "fyy }8`3u=S^>?x& -CFD"Bݥ*[|v \,*uQ)T2T|n~fc.Z"PS  3'+t79gJ,^y-wX-0AJQNFxA݌/uqW9]^ژ7i9 9vىg"niGѓ O-,Le7l.6ՂIʊ f꺲Opkmi@B1B9ѳV?r녥 KLj~6^ˤX'J iʤLo2:=QG4u)4r!qcj~7* m&bee>]Zw%+/U2FZ ex[b+,+slYd͡!Gg;V@8T/l'X6 F6{M}mպQp8>x\(` kܘZsU3P^o\)7v]҂=Ph:{s,}x՛z&l)=kT2 @(urW^" 9yc+jCpgtd 9Nj$bRs]I5 R 0BG`HPaXHC d"qqL7,'5/΄s;l/M@abBm!`XTnF+@\s|C 0*Di] 6$ByB8hrk0d}&;1n2r*U.S(#</))!%|IP6~!/$ d )\SbBe)27߽ReVUK5J yۮ*1FwU[%B`a~*}Ѩ=)r)Y#i}?HeUa/)*auP1Z@A>C/;1Ҍ}5;MwβC+zmhREl"XbC3o}PAfh|Nw_98QO=͇]:O$}k#H9w%lNk[K׬.ot(.O772Qg,.F ؍"'xːT_|;*T7+p36YxCxÖ;S,@($9w[HD]m z<*V-۶ʻ$}BE"p)xq+ }|9F:D Fh !!@`R)\G+\ CTh`Q9@O'3UO̝},vjW`?fq\>=&7ȣ^<] ]4@0L'( !j,P"C3:&L=sP&DFUHodC9ᐵ*;/{Шg]a ۲ִSSI@ah&ayBcf2 "hĭjJ-xB:򸏉Gd\ Ι FY9/e1x995E'.W3So P`eǯ]kڄIϩl hY= VnH;5mHwM׼Ҝ]: (inH[$}]wt3Nw ]wZ_RV D׽ ϐ Mߟ~ _?Kxtn)S)Yj絈=.OT{BoYﭓ4{6uTfS{G-=4PuëYk!wE֦$y u)p ̰~ƾ«~@٣Fؙc78jIF1nZh~#&@N+aW@06hU|s/U'@ P02Ki@eq14~ >QO/ƣ~eEF4 RI`N@g޼>hZ X?/^q}?ޮ<7: R>џX4x۫8":Y~Wzz}OSW'[ksZ~AmWjr6OqZڳS= Ngt?[OS?-0ԧt?򯑡6M MD+ҭznntdrjXHߝAgO91WRc ,m17'˳TG;@WHfnuz/#| ٷwz|7bN ov Tʯ8rsЈm+`'i$wK^~XMf*3gL2懇̽1PDhM >: YAТmX'g\'$w*=N"~ :(Ԋj/=?"0 e`ꇘPCZp ѹ0LXhj.& LM{g{l%8Bk)A #t4yqTeڊ{XC "ڮo$TPoD'YAv0nGΥ{K`5fC;OBK|=-?|f5rs1SUR tƭAS$'8c=̡'9WRD! t[ H^m5+hS?ر8U{ۧGf p 0(%@QI !3 " P!$JrZӹuhuGa 2N2sK$RtʵP9*_*5aۄS]mBj_9)H!ZiT #0HK&E@Uj ک%0V#`39ʶz(858Y7~@$tgJգFo ܲ+qF!rUF!01l  js/Zr;E9&*1!4Qh!E$`& BfL%u)W 32tf5>{qg 'cЀVYpQ8*  Dc )R3iUs* $~b#==HSTvw:.10R3f`bd~uD0G29 \S)2_@' P>&*~QY$>->sH pE+R KBb wEߍ4.k<D&oF BFv:=n,H[!sBhlkCE=4.(5=n9FppZ+-`hAZ:;ZBc7KEiZwG,#J1ɋ?>/!"p((!,J!!e0\& !YZI(@#pģfP?5)Ɩu窂:tCb i2_:نE_pniXFF8Jc4ݥJ>Py0nɄU T1Tsa*:*ψ`5Wg`[]±S J -/%J#AHq݈(&{*#+ FbXu}7o'o^f{CƃxR hVhe6LwQ_ZMxTğ68yro^spqZpPZ`?PQ^Sn^KuY2دFG!~t Ew#pȊ)GRJwo[*мPNVk0?_#&YG_[TĤ#"tm84ar>ɶ$&ړ>QyB4#9 A\\ޑqijN)DJԧJyJP/I%C/^l-&ECb=*:Ѵ=.S%ɬo[&;1 ܞdHLăЌ/. 
JBvKЗKz-UޞtO@_oMQ׉\g$Pv~Ieep{zB&έeXlbfqdFFAg_M M͔cpxYKDix{~0 $R;tC@z&Q@R$BlNZVi4fj탰Q5RFƨ&⛒Tlqֱeܕ[W?I%/, Ց{_MBJ۬7旐bc?gŚ9ui<ݥ҇H|xWyww o)]vMkԦ 4p~Mv, }[˹3xKzG6-ӟVqy 6!n;j =}9W[  oz-(#/Rn y)gm툤+6V+ncOrmyx!o\G!^VpZ=7@J5I[4؍ҭDm |#oN&. VA^9%ʢ|L$}6Frhbԅ$9a1W$EU4/ 7,B!h/!iepϯ~wJJJJi+. U+3 q{|zvy2Z )#ejt!jEs-l,)Him ,2:* A/EHC2N7J'EV|gE-< d^Io"t_ ceqJwݷd֚\g?R Wܽ(c7&?~w]y\ToOM2o)G33ͧ*sT+8paFm:8M.XUB `&-7aK/12ِXխ"fAH׫1bg*?=D^lHW0ރLd;dw_щD8-vN_KSeJY PrlI!lR"Ek?KgaUJrN {Q o Ql&;k4Z`mBs8r\ &sXKECE͝Db74)lp+p))WDaZ? Oez4Ѭ]vJv:#٣Rq$\@5{L:8gu=8~TCL@(n4|Ёg"i]cB6 )B| $>dfbfU 34q&X~|k<;+󧲙nty+*!9V)rg`sW JGH -8 {T9.5E #>8#[ccm'% zO- jd <.r6rl;&ì, RXHStuXQdNAj$OF >8ð9ހr=`"?A18) O8_g?W|qa=W+}9#Y;wtJ hTuT`Nt!cF۱a_hj ɨ>P®RT9ZûbPLΐ `-xzPz.M!XzJs(+6 γR~80^st-94PV-*ΏQy.>0>9Ti\p!K>ZwY uB5 8P\bsZ9q d "{wp2pNCKt nX]ni巆A>e $9p?Yޢz2w2NgȑʆJۇuK^4c/h[u hrDTCo'> WV~|u}:=ዳ{#/V?ŝ,p^oNY32H[띘|} y,ʭ}F/87V;{WOBbrCj=~p~wTs+zpC hN7 BE¯JHm^_Wda1fM1C}_>8%ݾA"'4y#ړēdj?:$Sߡ֧6YZk!w˵fpwS\&,ZGǀ" !02ZRB.W|V.Y!HӘy]ft)̚+Y'}ˢLļV\|kkcUZWj,(?p,5^p'*P?mPW'[ޤn·c@9^À辋 5s5䭆u#IamWcly[q(8>~O_{bvp ||W9Y2S9wxvObqڻY'?~ApeԤtOkI>쐟ཫ>,η'KyXY9톼207ح#K̂PJ KBVJ$ jAJ#a[$kbVK!246kfb0N(I:HV^(r LZ `4 :o)(sF*HS&I^¡Jb6& A9!<YVΚJ;zmjuX:^K &x\΁[48@x RSU%KS'͔PYz R?lQ*{]HU$5?U!Px٘e03QלSYBxgإK x^8mv1rCL!Hc7ٔ܃^G|o҄~9gp ]\RUhnīۓVnLbpVr|]$1g<o)9z 6*y>LDM7?!s tssNW/aZ͗ۜQF#cfO3k0Mcl)t-/VyfF4rM$m Z1sJ{Wa_-$飘 ډj>R\^ZER|umWvt{fdٽOx9` ^VE-%0bOeSdkҴn<`7%X>VR֑TR˾Ako4n{!kaQ~d9(U,EvUҾ lS|vγL|׮î9; pJh;1`)uy+ɠći‡ W;~I7߅1k8`!/߅5I"WBynlmgzF_2XP+0E{1ؙF_2#yK*=JŒH)y$EUmE/22"321]BCfUJ3“1%RIWy Z4nWbnix!_= ˉjN_nYd%t׌To1!XAb@;N( (!:wO*!K/A)s=8k%`MDBG$1a/4@#_F%P&ġ}='m^\朦8OR5 SkW$RHM$EO4*SmKjԁNBa6(Gx[&Hd;u" ;4\h,&N$'>\-TCl EKR+xeJYBHE:<ɨi=o(]-W})\Tp_s6*tS<#iwLrjU<#Th+ 9 &(^g[A`{%%XT(kTb!gibs3Lco jfU贑T̛<fY$s"" j$N'N" rwr?QTi`^$p,) 0qRzj*y{^V]`)J-)a .[꘸I( YA9h.<h>b~s[ mdbeWYG;ٿw' 4GN&?FuT#1G+XgSi3~x;j6N8}w0SID4H˜70@uÓ!Y̢7_&͵`\r@:Y:nHL,!=H9 +'%yָ=uo$TWt߻n[o_l[r{tR˽szkPmXS6,gȸjH fJT%LsӻI,25Xys[2=i]Öǃ`2}rL b:o/k$_D-pNW9 N?1wP8_{qPLP86( ' xuX4L-.4|{fʈwt7 p(N޽InVZkU@e2N_f*Z7,"#\Amdo5?柗^2T)~^E;VdE{&$7C@X[9 a] +Bmͭ"j_vbEʍ]-R`TRb/;7rO,QHڽh$)JI[kU9zgmg OCӐۻr$yG>/Omĥwȡu9pBVhFugyʦĜ~X/V, ˚DY.sX฻K_nn}:&o'wa]y.͖ ;t3*?[SWok1\>㱔wQu\ HXee:&y,5Od낒Y_kў*9NTnLNGUNI5xI[[ |T3X3iZhE}[UNiNidwN5NdJp*K@鹜߳,Fv_&c){(e)b "#2J^`c~/J^GcլvU/zN {r 돗U){yLp >{Q8Qau*ωveB.u닌Zbc 3J'6{)nX(4Tcт7 ZchpvT 8BA"APO_owe!F(@Mp~I9;{& P\u 8ԝqb_Ӯ~+Eu\ς][qrfNrj8%0ξܨd/TvYG~9>i7mN b/c?mswWW~,m*l(%Ð끵xr!&~s ^to.~7 k!.+1S9}ʹ%rL-jfGʄ:%Sh:%1uէV}jHn~0c߸ nYg&ߝpj^Yzn*z/B.H'$*'Yd4B$Ihrs dHmX<$90-uCYYQԐ"$v.ѵrsJD&DB (UO-2&,2F N$)(H!XS4 h4(EwC'[P\)`Fy$c R>,"L蜑8Q&Όܪ$Q&q(I3eDsT9v϶j$OmCMja]YM1%WMUڍ^=I7K;5L^旿l-Lػ>U`wh7GdDxT<)g&+tV+~ѹ ?7röwn8|s;Kd[k=~g99`jn}e7p0OaS 9SnʕF9C9FlI9Jxb`x!@8ШC"j:DT1nm8wuJ-i[޲jB%eZhb{j.7>#-",YURsMc$iI@W_ߜjåQj&1&47Teh&l(4?=ZBPqOTQ &HwR+vzZˢuTY_Fx{QMM9Ň^ʠWXt&\խs+}t"* ;~4^5{'/8 iiomeQ}ϰ»*x̖ơbɵۦ}~3iŕmK8ֹo}X\P w27tF3ɑ&Nd }JTоy|zRռd^SPL鼛bMMp'8zJBGiDr&M<66#R0DgqeJ剫12AuLTZEeNMuXK;Nf7KZ>.,lZ]Ufk{ޛ^H!ƾ?d 83^h;#;ϖ+wʕ"7t_W:y#ԝv~2ֳo0_5uwЅ :s|0hi"@f'}u#Zr:(w%,^,]w_es%CWvm.>:啩X-}9ٰqX% $nleDT\B>!Cm'j˾І :uJ\ ҫO\Q :ԁpb}0-DWjn/d1HƲ(S7'K!yF " 2Q3GWaD~NNT⏒{|n;~8Y![}-7E EqD ƺ+teR%#s oO#?c{!X]41nTdU*\gRd;_phu'Azj^0̠tIuM_u8q#KD~^Sahho~nh:ԨkTDu;RZAq%(tSOǦod &rfg[',_E8C)}哤hU |9tOЦDҳuhoŴR[^ܙƱ Ȧ&1ٸ3ʰQWcBvJ '_&,6ihh0_;bctJ @fTISc$m^c82$J&Jʮ흹2!fQ1%X<*I,wKdBIq(gJ{8_!%P}UR`'o$ HBRv }IJYa[<:H^*x{8ɉ:U{f87Gjoa20+Ǐ=ߣ/Z[@aqF16&F]ګkQpt}qqTQ8L}^O9T$kqϭdОxcCw 7WaARyFf8ngm˔4"wZmPTxph[Ef@ڳ&c]yVwC4w;|8F4 %FUtЧZ~ 0;dn'ݏJ>mnmk!yn}єo!B/&#JgۻyUo/R>o\qqxdeleBXj '-W$2 :˘mie䔑Xf}ICrҿ&-HU01۷3f7';W1ǭ/?NIkkOy)5*ć q 5x;|$Iݜ$Ir h  A,\/Zp "zN1¤E %n% QӞƻ~{x>Tq#T/ѺԜ*Ҳ1?B3|0'lA b~ft!{-ҏf1"\} |剾yTvGQ3T! 
s ?iA,#I&k>^jLey1a(4Ѱ|6{dyK鲸?jMXZN{]n> O_ZNMŋX o=y*aIuqB-J4XUydAMea֘㠅&kn_mpo㨈d;fhp1.d[^m9F)齇\]ֳ@ia/DfRζܸjOCEJF^$x!Mb*&a,ToO3bI@ 2}h~@صex1]~iJ$Hc\]zҦnWb~4ϨI3z jrhWqQ' ];ntfyTz ͭ.!Hb'jJogNorK{$1\J/!'iohc Im)i lIRA)'\_>U% m%qgjhfeȣ#2YW*o!(9۟~)ܠ8P]p(aZ0(TT.[,Ŋ1w%Tysr[SgB2c% z͎ \_koM7kWZ JݦЁk_FrdVq[B&t5Yl98Mvͽz%֏}tsx/؆KAZKMzޒ.V#o!nn+Q[NKu(i}+Z6o6K.K6s+RC2?똤0 6+ J{;,.Uf#̌VJ+l =̘Z1+=DxM.4@j<^&BKշMTuJC)9zvHJޫ^^g ZNk5dn9pp-0C[11ݐ`l"ъSb)]{Ol`yrs,+T꺖EL[ꢱi:yf0VZWrq&1‰j ܈ =vI(&o|/'x孍<Whr4$1jINIrIĂ5GzxFj@cjempZwC=i&˽ĭ"P2IB丷g9|rSHGJQ3.-dQZWV %!Nzܧw/m`֙4&-/e%6l6J%$CP%g|勺%.(9ؤv6v͠x-bRx-/j#jYK>hTmpO^ߥOɟw6\󝊍*3%(AYjy~6kTIK8.{U/cኚ=">MƁ]̴~y:-c@v)5m(꿵i0 -9drJZ8I4O3cP&  ǫ)c79!n98{F|`Q(t`q'6joa\$jwE D2~0sں&"!nOnOS<ʆJ(MZ*H9z ' @㻧Al88wo?7_| D v[T*ᚺ_06AOU1YW @T٠9S8bQza qg>XɀԄN>j֑GzU=4K~ثz1Z.u~:ByDK{ʡUNOwͿC fiFzWV׭vj*JH;H%o ~ d 486 , +ow糄q}KYb Jsp2"@R=ZO0|<DT~tO)\VfW̛]_wZX5Ne%ݚ>^O֠(CoF6MJz.ΩlҒ3ԁr䛽sE(pBGp}^w`tzꬭ$vN@#Y0Ո3{!Qj&fzӋ$H50q={>+|ך8(km?¡vĦMSSGGcW/66Z8EՊV7׏;M^qCRa\,NJAX!D`rEF/ ^-Fl֓ k5ؤ1v ǃ”c*#(&S8H#\IJs 9Z8O @aOi]N%i0o.o> | 1 s} GS2眀TJF2$+(*B'g {1+,2`i1hPPC L*cUB0ʂf)ZF#$EuBp` D(tP2cÂ)CBuncq,TRR@p9,#eET(lJ\=C LVq$ʴRP'#yX81Ey`6" asFJ۫&Et"Ѹ:NmmO(vnH ./ᠸT@gV//"Amy@BhΥ԰UCF\x \Hi$R@ ׋8~Gx=pwh9Jf9!|ct㌧%(E)K1وFȥdzɥ&gpo|Iy8=6ex=T Є(rYHN ɼ6g# Wg}z ~F}O{d6I+INfc/^T\0Gk4,1r6)!XR=@A}N/RqT Fhq6#7 0ED+2 <<;1ɼ „K^,F,8ܩ^`"Y9AH 4(%qbDKE!ꠇ7&wZe.=U/7,vg} ,f؞|Ҵ*1KY,ffe0dKb_DL݃u,fh}S ݭQ^2U4j0jj=9\v+WOL8LaV|,%xa=-H@4I sW_c\?o٨jD/vJa: ]2 FM'_L?^9-Z;9\߇y2LAMBrCdr#-<`:PuYO]nb_M*qxbO$ѧc<=Qv?~~Dz i2ݺ[隤pG>LܕZ+>Cb}fr^RݤJJHMYS~^1'\e"0 3:D^vj*.;ɲy87|-z~-G{qxRY)p#B:fK!tgj[ -֧u$+eIH {b S6xD="A#J":pp^zϷ3Bˌ #훬K,MZSw_6Pjz}nԼ;zdF\ bnﶀ9Yk#!<edYM,Qcb!R pLQdDNQ j=YLڴcVsjc>xY)x#0-kvk <J/%5H2aP0'Po6%<Ԧϛ6 R{g{aR R#=G]Q't<+ji_(VK }$źqOX-zIsӷ]jxN[xXG~XZD=F6zo%lF#jEKyV=Y `; LwCo3J&;FpKf ~ժFzelb&H/fH.FIA^]MOn)naqoSp|8=>77R/+N/W6W2[:!5!ҏo`VOdʲkK7^_Eٺ.K;uvLIt#t)VhίN$uږ!QBaSel& 4Ar{Ms "*DџP$׿|}<.Lg|Bfsg&oA<.>˶.K{W='Li|F8j*J"k*^KǬí֔`i?<|޲|LB T шxC>GB?뷅9-)I?Tle7 ThJ1"R-G## ~\-DșhILASKFjĸq2η$P$OoJ212&? iڦ}; o6}޴)~)5Bw՛x)P?j:(|`:Δt$(]ɴ6, !ԍNxZ-|zwJ,bm_q'>LښtPE'xJ4jqcjKUhT"!UAr2 Fw LY#Q`xNI\a%*K6PxO~ӯ"Q{1|GHҫP#-YSQ fQry#MSpSphQM&UZMm{{U=D>JB/ eWb=⿚uxv>l9J.5 tudTuu.lՕ̥4-LRL$Ң/-@Tx[`2eE< ToPԲre+j_R,g"[R< [ cĎU `rE,C@]fk[ :, _4MjXRi4lw=wdX$IFF㧛 GI~.4x3~wk/n>"|wI%{UTxZ`df],C"u^ev^͈6 iLfqR{G~|e.N:K빤1sD)(p ֞D>52[& ASIr=͔uX7BŬJ8[9u~˙չDw|;L:fel\mnwD^WsJ2pCsUF$:6hͲVkGgc?/\K"7)ŁY 1 eoOɈ jޯgHQ ʂFf:}{p6+jqT0YWEDJBZ cNC yE ޴33qeυ򁴲erSvg{>l)$գ L=IVSD $´ axf=0P:"@pPn{ow:*-o Ђyߛ3ӑ)c_1H!VAV*` yU-[\6 CN;WAX/>-#{qRl -Dkz[ijڋS´G*Ѫ!eq`*nO#uV,,<5hI-V:KD) 5!!n3S-0Oyd!~Ʉ+%U/o[ͱrp{9';v|WS|Iސ pY3Z:2|xߛPx({Rϭ$hKYYc\۰k0W9IP[I9j|| pE)e /ęF1QÒr1D(v 1, 6j$:.*QD>-J3(9vB7_4BKօJDnCZ)`O!mn1ۿas-xzSђz&"<\IN#AS0%X M(Q+)I@pBnuE^FmM5Qw?[}ݻ9rw%5KmCt(L0P+z{Lihσ-jt4#S]@r x:8^CG5CCȴB1(8tye8YA_>o#.#`j-ڒI@R—3weQuxvJLV07+սRP3+ZGŘW7* fO<ݞey̚qPgB+#$&&Cl= \ vі[8k=p@'Q]eCE ^I0F}8Z_s&gKޫQ!Cq,/yC7Cj^R%^jS*L&{Cs]zu9xr@62Ds7oaDI`%?M>0O)ysyCx !=3fRCf)(?lF.aF)|-&\L4u͡ nJ`İ$#GIPCzMkN@FcZCf,U40-pY:V+aԥ)r-B0TLN(ɭwV $ e"挖lffQ-Kn',NC~2S> VʁuuN_^MrdXddOւ,l' f5i9_%륭/bܿ lfůܗlޛli 7֔d>kF0wfom-EDM|"y)=zG7/riHBr)MzOMN& %jnmLJ~b8\jD#E5kWE P[u[EIA@+n"Fgt:k0:2ŏ֬:`W o'?b`Mm6pQ)IZj . 
d-Z'(\.sH`#HJE$l"[mrʀ +c'Ep_+PiaKyWYuѪ/̵_U]\UPWaRz7o"AP#qX)/O}j=;4׀|"%S > ݄D-щ}Fvúڭ E4B$帻]?;t-?LFSjpTiܧZ;J!J:Y:5[԰SkaiK$[3ųnn;A$&y0wokѢ@9>ZCaDU$v1OZi(Pݹl sW!ܠB;*| [yJu7{x7z;Pkk.OA.ScJ;W&hGx]Ul*y,{|p#yOCϑխK;:f${X@5d,l Rt*kLB u략oop48S[{{`y~֭kQb@p%hTXwD1aSo"t9aKZ`I\R]ate>N檱%NW1E7_Kٛ]()W;36q@AMo囤wޭˤc[Q!6NO~a;wt&g?wpfDZ_gz(5ia5;oS<njRӘ!5Ɋ1L?1OUoWQL(MgPK?Rs}/cQDڡkG8Fhy-"n6+(L+O9;M{c3RE\m *zš]C79%(J(hC)|v>{u-VUf~*H0xX3 ~C/Y{Exjm@szonEOWb*OU| 0ʚr7%)\Y(䒈 O' Opc|o]!$=VtϨL'oqsuO=Qsd r+s :R* eQpզ SBrĺC~?Õ旀@;w܍?gBXQb.SQy@ؠ,ˇ!58XVWΊ?g6jtw v'% ׈'<# CM a.Ʌ2VIaA򖏐Wl%C{6VςlrKCnXAyZ:4kiu<{$U ,՜Nx'֤}[^mk{0l ^zH9,ĉerCt 8%[//GRo1J%¥T5/.pQoZ+ K߲ъEPK^LiѰ9< fqwj Mj$P{92 *0_q\_"/c]y:KN)oɤ}:beGAHo:Zp*_wJzĕ1X1G-hnT$[M{,8}kla#%5J j+ʹg}U~R#d'DMsٳ S qp!gvF CDR 0րT@߯g3V Ϗl|}<H[n6w9pg^I??o|Ƈo|_vIB2?jNQIV))͘JKXö́\pMՊ/x1x3N/zb|SOa˼x`e?Yx%B+cBCR>Ƅ_:ѪM _SB2V)u5oz/)9LќΑ%\((& Ɉcg)R3m)55הHD!+jL5c\1˳}ίN&+H%4xJ\KI񗇇:N>Mf-f#.d,TD#eE$%Bۺw`iJj8K >)l`ynj ℑYil !RR쑹uWn*#P(xIyNree!~0bFXUa-$R4g<͞W+ [03¶h VN,U"Ja HrJ r=vs 8G g"MwNXZr#4NJ-W䦞13E99m Pj( %c(@@BɌj04iJBpKIrʦ|N΋-P<ӂ0)?:'=BގLQA)660)swy(9jf6t3iyق˫w/ p[pY+޹vӵ:`b~_|惟ًk!B1F33N46ceW/~wb;;_w_ )!q1{ Tq_⻾d4Χ޵c"e1)[>tݙ}nU,%9Y_R͢ĪR)Qt;rx}!y.:1.41H"lvy@ie ^ҋ;0vLՂG[ <~upƙdz꾝̲ Š+Ouq]nMa(ϟD?ukkgBH1;]&!ߗt/3C&v+JUZB+BnM52[͊SJsNhiPR,Ʌ8aٴ慛a=!)^q>a%{ae+E1z8 ˍ7^ W@!]<_̗޽_+v0#O3b6Nd}W1&(pM7QD}SU7Amd0H(i=cqG5eѣHk+VR(V!' 6&y̏9X(>I<SˡyGp6Yev~.RˊPܓ&3utw]>ݜKQŎ6͹wTLǃI91ȏ@K躈A XMDzwj4a1IwPm(C 2 {+Pzfi"<>d.(%[i1M֚]>z/} z$e#ۆ['p/1J͌4yfN.<:[dxRH2IE} ڦ(hGga;k-!ao&MԖvǣhOlhw6|[&k`QLƤV]NB85Ż+Uߔ[hsUqT۶}v `Q~gs@8?c9AKFrRJj)s^ o)J*K`beqvuj K4 *I꺧RK:RxVa'Iq2L'íu?rJ c |K|T+y. L^_Ms .H{xC\т)9v0t[z=BQѬ1P׸M&+yO21H:mNfNiTVƇc*ٰ1oG^ӹq8}7:xr;9nޗzv\zu4[ʮmw$v"B5#_9S̉=9^h83%j?o4QK .JYɌ* r\ NI'3 Tț|''p8Di? Ew)4<O? `K P%oFOFW0_?3'+uGd Ũ.͝{ Y,ծݻ_+NG1I I)YTODH *ʓ/U>(-|<1qW*w%w3FUUm@NYX1 ޥG Q;i3$fi3.ӌ.H/ ya&R3KA%QB"f,$~b̆?ƃ(v6^<[Ԧ>8V8#)"I:Atbq3lH.>CJbbG@ rѵj_ b A~9nsQ0Kh؝NtQ1[ 5)H])u-iprҹĜBS)`Mg$K %v96qdJ١W&Β8Kn,Βj+e*[Z\i¢J-4iDJ E4Wrx#rXkNM3-E5S^zALN/xzNz85BK3 TaTdtpQB| D?D_KUݩ=4{QL]6Q#\);\cc`DXpQ=b d!t'7 PVKX;RJ,aH;R@B CN)o;Dj򜄮Kt"Q\GR]v~+tB(U* jOIm'eTx±lls`P׎sR<Ԭ28HS/EKaF3'b)l1hs l5-Ush(_Gz*axWj^3Xh@\el9VZ" /PU%VQmm<dT'gͨ@T5l }fJfkݳeDi}}gIT{H6q=_~DLFL{Ϟ%U{]X5/;! 'Sw6֩!E~ۻF|ysCf 2Q%CTKeVF/XE\v{Te] F x BދJz .c~?\Սe݇zt55~وDH^{ ; VLˑܡ$0AV@s/LL- (;Lʔ`5S֥Gjo"7ڛHMjT&)j!c&1٤T(SrVy$U3. T o2sʊm<}|1>0]Lh\ }IEi%0E[3͐F;flkKA#1{W C4cFAɊ5 )ŵEf's LY$6ayp*~c@ƨB1W^;=5c=q1Lz8~?]. s=}o%RM#ogO?c4`t4p\[@DS*98Ea3?Z'ցP̿#j06ReeѾl7٠0J*> :FK ہA Pk{N쩓30U˃ @=סJq )!hQoa)lSxyG]#PD,??/yIhI!ވSdz Q?[@sۧoNU!mU8FBuoJ{ @%.Tix9,EiwfF7g~Dǐ,'S EOb/5g%3>Uݾu|^Rr[^8S_i3炖х"~Pd'͍#.'ar2 d@p?t;! mtr Aqh]Do'mp8Q*< -; jЬHۣdLKB[ד7Oۇ`.' Ą \Khv~`|G[/=|O氅:vDl=7} x؄6h'Qodž_"i%S/CBekr'e۟nbD*2 c:5vϾ'۟"<#9PD7\ҥAϋ}g*VX/N`܏Giz?g2 ( Bfx{f|DZ9kIqS`ܥzPzj֗-2^ "nQtLwϜ[fK"E).SB&P q ')Qta)}sRAP ]fER5;dϵZ=/7fKJVS20 cDlcPC9#BiQ@0CkS8@Rg03+)猔R9XJEsk0ҥނC 'r@)mM]J'(ou+9rTe.1aP0J&`i4X.F[@t }U3vIhʔj)PR,yGYih&pGSØ@8gTZ2Js0+wmG| v J} ,%` 'yI UM2IxCJ:Iϕ-HTU]&{i v`󭉆ILìrF *Z*]8u.KxsP rB;$Jhr1\L( 2GJ*9bbO;&s3帤"bE< ԹȈPDLezRt:<xԂr7ip4 L$n40*O,X[`O7hMHBNd"'ΐT AtUZSBs WJroEtG/k Z_=>/n gI8%ӂTҋOϿD@qC5 ZL]3J$R3t; ֠d8/c?f>`Cp>SfkXNQi\<-%ϟGљH NA |5Kw1x栻@n`FǢ9A^E|UOoл=;{%SAPʞ`:֌ DUIzDq3% ^@zCȵ 㖀[FA!* d 7C Q>bQm5E@=xq0R鈐 8B$,7уdZKJd!ֶ7Z}'R[ dz"\ 'E7x,G9U1pQToHРv0O4wy%`&MAъڀ`GVU;E9Mխq`= t-;/-(\Sbx7muGDm]$`Sm_Oo\^CbmJzdCuҷY@K=Oh\yXv0cbtDZ+\68EɁ9x2+srcO + 8R1>H:  %=;RHXlmfThTwԦbMU+ RX&8bRHZFVx,,i򷌇lю[D_?Z걙{9?Z} |=syPx|3"}}9y3BENO;pz4 kM}" koy,># #Q?#J~Z6x5dgGGRc|B]Lʮ|s+7;<@U}ěZϸ{ Da}1pǖcʕm!fIomAXGpD$U!iW|!cWPKXb ͭ ̇ F oV^|_IK{SV0 L$sV9/TsFшqz'0 gR1,B̋քINd~SvCDv4GN ds"֊dGsMEkfU_),jp4ku ڤ—# (Y)$m2ZMl;Bb01% A-,p$ ! 
E7B$I ̵3OU4lhQ"840#F E05>8M !p<gFQ"\jىZS|McH'䖒'V1#Z: !2:r H JBܪ^b!:2R`j93 qI"oTY Ԉ%E,&pDK@Ah$w cC4iif8S@9 XAr 60* /fo տ/ <,S6osfe&^u}ןVi{ɖOBv31AnmbMŰ(@_mK,#HdQ&z_W=l@V dJl~J/ !}? & `6)~NIt LŚ;1؞_}d+2@Kjt?2R9&BA&@8*[Pjֽ!pC(7 #ZLB~mw w/?p=:Dg?<'~K Tyo,WhL\ϯWkYhyCN]7o({o]̟[wϺx_g- ^jCF ,TİN$?O$[];3w։dTb5Sg;'~G}9"K:i(W# E}e|K@M4^{eG G(9 d;y7!v@,Ӽt%.o 4))|%%{qԲ||svxg ϻ):B{:9Ju>MOXlg)=8f .9n|rϹkCp(SaVu)# V\y,̺Yq*LO9W+EG~u."WɬZ$99,Ifa²R}:{aiuuV~7XX!Gm9wCpTahZ6GWWpwq*(St]]ݻ9咇̩^,mP$I핗N!z]#>8ft L:NR{LҺ˥3ptJ~|rЌa.e˛,Xsh.{^ iʼV+#NSj)Rojω4Brupvauu򽛌r: a8h d1TC4.sy!#,'8FY.b0ºuoWǥ#I8JW룈֦)3㔏> ER(C:B xhG!B oUV@ S?kWX9ÕO >T"|6 OѿM/=RJ H䦹q1g4JHǕ_z[j8(9  Lb化D2%/뇮I1,'ˋa~ܝ/ ?pŻsHꟙ0MA80jKzrRZ⺽`K`?6|͒ /Ϧ 6<̕LǓf٢\ύ xxG#f,06}YFqqK[`ف}>@ws2 l)J@Ĩ4!B@|BtPjC1 L>i칱X!A0k`& [ H& 8uK C`VڋWkuvKSGv{jê[e`gXrI-&bj2L:sQaրsK(kVKan=K\F>jJ9߻T1qwW mw-;S/*IV<lEo|l`LVkaLp yYOkp~H,??gC"**ڲ9|3FV 1f/j]k]TQ]vQ&*jםu`QXTlYTʚcсEYT񌺮@!l@R?e5`(2(g;8İ#Od.=htP{:ӫX:2QN6q"ki|`%6j$Z_iCH>p9`f^N_K-b7Yovi.%]Eq=QUDw[T$!*zqrnrUP].+ J,re|(\; v1DH8>0)@8*k) dkE RpC(j,F=`V `~~"&ϦtDZ+<qS6Vd^\m=6 &-mـF)YuqWcj+A-icz", \mrJ1-7X^s60MC^uFU@b*NΨ1M$GN<+ f(2pd h^Ct ,`EAaO۪T3hRKUeMJhEcbUHAU3u߮iOyVۄha(NY_=A?} |y1_eA&SNV5V9'd-wC!:z*B??{q kN/$R'TVHʖS$@8]eK랞nȳb|x휲wiOq}{ 5U6w/_i $.a<9d+ʉ8]adc ÈcxjR!jރ-T cf 'ku?jTx_ѻ}=޺W`!(ssyo m} ;?LgθIaxJj =eMtk#Wzbۥq^Jol|OvL?; 'ӕnrbıw3u9{m >(}]Xt}W.A2UZnN7hK[[ݚW.˔ R֚}mC?zmgٰ]xj! }{HG Mb;nŌ(MzjYG__/xa\>CNh{OL8ZQbg>KZRۨs5leVJVc>NŻιXNFC}jpRhwewG)cꜽ`ϾU^m4tBZ{BuϽ:g>]sNp!Qp۾6z]]"xN=V޻ZEIV1tHj  jsf:$J)yM~S0w}%HHSuRLy$B8Yz3+<ݖzpfuK? <1(_}KW!9y6DH1Mȱ[c\O?-)`_#Ͻ0:`^!`akYL/+gIˤi.ghRAЮm>jƞ%gAYQu.~%3>"-jO[Ԫg#Yӝ-ݺ*s'V!9ĵUH᪐ǂq{p HC=lι>+*]GU2jٴp#z*~{}|si?G aF?kBW³+?kA HC^G)bisԂ +"H^T(**@b\ThQ 2y M [ŵ¤8$+*.L V!T+*0t+u?^<ٽG{@BǾ-\k'5rq;uq B_^QB404vΞwrU8nffwZG_VB 4po!) 񎧺,x?Uaнe& sDSMDf-c5S͕g ObHއPe/bSj$u\.<dxViBB^Tz_Q!<{A~vM8.yO<[E4H|8C}nN7h >S;nڭ y"$Sޟ>nwv GtBQG]2F3 CݚW.A2]aJo1[ɕ `]͟󍮄g|-^J~M) O - p\ 3~}~pX \~t'FW 7?ɕ'hO MP7?'4@:?A#a'?dJ4XJT|4%T5@!&RTT*#;d}m[4ۧsz'31l Fqa? gUs..5uf]=VG}2|#h3>Md9,`5F:{Xw ZVga4NKM0p׸Uq%( p8@B%(9Ԫ(z 33h,‘i`.LWLsM"œjxBLg_pl˰(7EPOfg*\TeՄ 3au 2sk씚6ȤhY[OUuAJkfc"aTo\9|el= ~+t-.+BmY…ebDh˔P)]KT1ㇷu q1㥭~QBg1bYۭNX}_2 Fg,$2N74I*nmVDg輅YKM]qnH/i9[Rݡvc/5K/~6u/_3z^<+ޡ~U7O7 -Қ.uyⵞJp?W#U)mgCx-{3ȏPK߬=5֭鏛6'-jj WoD+hK'Z QVŵÏvsy*TBaTo*I\ 7Cyðެ=wܛu2FX4bˀP*upBhk.])fnKlt)!˯O}ҹ'| zD:R'b\&w1Gy d*DLXBMDR&1KxJ_Q{D=b߄q_,VZ orBhk0>g?K->«Ns L[j[HSO?h/xCSzZ5CXe1Ӗ-\/kDq>#N(C+{\5m fWuLѣp-(MޖbԵmlr;r2 f\}v{+yYh,5XȐF/J֭9>Ƥ\EiI*ys֕RiXj̢jm߶SD*BhD6Onp'/mIxa9vn{kF7 v(ä6HQC{Dtv(&Ŗ}5Lm1͂QFSl/bl4?溰)t{myT6eDZMK&D3]T$fe8c""Q&5mx<`*@(l 79k;Yϻ}Oo={Y ?KKn*Q [gO~B@_#h 8eJ6Hr.RWP>ːf2>tsQ)ƠS=rVX$Z2T"HITX˘ *V@4L@c$X!g. tVsZ\ d mV|](Q:diQ?~ $o,GXL#wigL,K-p7Nϫw+$xհW.A2%P﷊C .B1j3:ۿw% Ԁr=[ЅweR>.B\ 'No8yyLgϟ?;h]ɩY,>?1(_}Y>#ܯ2ynx~cjsbH4q&v>)%[Y$ ")T8 wdlI,ISbЂzUh 2P׈c \*uu̬dZdD3D+%ВZ); "=ɒCBr8#;L|Mc , kRy>[m[}=$2Ih2Y;l0aø:9&I!ҨHbo"ED6]Y[jyWg4T_nfi 3^Mfϰ$&Y$D1h(3VJXw3.X{Fjn}{/92& 3]-,``6lPe&I# [NC>ϵ f)ЖHjQ+A(c0@8.)&1D)zʉIPxΜZyb}+7~m{hDdl^: poߦ73^YձTW*:č D:mUCz6x.aR# e>o?Hz>U ^K8! 
pZ'%iӽV>S;1 .HkYk:3XW%vDp>LI=/pwPM.3tj?hpu#1}sOoN-I`RwYoY˧;g`9 e}ਤާ5H*jwl) 9aXgv4<6%#!z55.:<3(야\U@ML8]fZE6; TRb=c\ɣu%4ti  @ gj#uB f,$i1F1 L)< Qnݛrm,_k)c]01͏`1JRa8l`'uLl Xf3%jӸ14F ShSPOI=JQ6 T&Pzjt9Ѿ]y"ӈ "p<5JHY0Z^)ct%?rM 4Q\+h X{UcJEN2'X,.3Fix08Y0BV)`m k,ĦC2Wjb^ 괧z (`p톨|P;g ?ih3Toϋo 8F57v#Cq bW\U8tMWDl4Za)X*2 Y iwoEpJ3'à"Ubhb-% ӍZ緯Gej]5E@5ITF5.F(GW9acX3eK"T`D5R"s:AS^#ΈS@4R C{gCw c"~02AZQ;p ֝ )@_^o-x#$ônW|ke4Μ-揠ҥ3v3HFKW/7&ן''*.Up>%v(ɱXDRxtJ)GJa5,iQ+yu" Syxl1s^^]&'4}d?^d @߅dޗLBZT ur̘'dgI?9n6 ÇK0e;yIj֙}ɺ\ ڕ%.LnI KX |9߿ @=x mȰǷoCFA.8vE[ST0B;ڣ/k5A;ih{4J|g%\"ToHFze'JZeX7*ȶ)>׻B\}MJaGتGï)>KT!-Sc9ez(lIr.y$ɒAd,^[\L*2\*ÐD>JE`1ӃP"EM36/nak= `_b62A s*n׳JqdAv3, ǂ };S+l,"e@#(z~/ *xʮT"b.'wR# "xz=6ÄsajJ']!O2F~*ݦ>ɓV} ΥFBBQNTs`Ë́薣xMjS]ۀvtM!btR\#ђ 7wKI n9Zȉ7Ѥ66}uKw^hrݺ}${kI9O M1<}/Fjs7~9zRٚݜ'yB@)ǣ5 s'pI/OrCsr%KQLsG!ǫ*9[?E@̵t<ǃow͟heo댯ܽ)o҅J>/~1_~T}2PҽY+SF"# ZFƠآrE'DGc@Ŗ(tU`2@yN)p0 Ll-rتכ\'6FƀߦB!>fBNtQ$Wb$o Kiz)0j^Dk)c\S(o}0Y ƉWK?Y5y;xziQLj&e:irƜiM!(S ďIR-c0y'9k ωU=YOI\QچW\YP5+㷏r_Ӌ5 h/;nO>?)D{.e[{IأsT{_4ت\b?e[kuws姇nC^[gЕ2X*KZX]j[SȚWnC^Sci0mF7%A䉀jD )e=YĆf;tr8;U!x{ w 2'~[Ʃw;ؒâ{He pENNrХ6`e zcdؼ/1֙F(ՐΣc.DY_3[ @]pm 勏ݞ@sRhM:$Pn bbN9 qȥp&ȒvUr|1m_<=hS%G?^T5˕.]Q<^=͙2nB )Tc3W.WQZ 'o|VPkhk_VĎl}+OVZO^]}Qpo=i{ltIS [4R2anK0_Rӄ_'uHJ&&Tazu>sO_r€2rh2i:Q(ʾ %SHZV4/CJvJێ2 L!gD*:% 7W_m`h/$wgFq/-Oz^!)6EFĻbT{„S_Yjվ]a\؏KOw>#l7EZ:_r%eb◕$p`n n9Zȉ7є6TM>M݌n!)DT BN7bxK z8䊆F-ěh6%$ v\זS4IQCjUFh, ^jfպ򙝷{0=;F@"esJ EA{Ĕ&,NŀcdHaVɎ@=H{ IPyLY |GJy_D(EȠ'XRhQ~ bzs'݃'ՃX*$i-xeog^_d$Oom39ќlz/%asTdC/ΝYUY?]z ߉Q ,<KA1œF[i_k'>5=k@|(B!ceV< *ψlN ,jL6i\^UR5] ذ`pLV F ^+,Z)P% QO1S  b˹L,=ߘ2T/Bp JI=hjI&h,_1Қp9x})#R a8W>+dt -}Q yx84ewdy QVzPs=DYv ]\1Gt]וv3#\{t YVD fjh?^r?WZ<2Gr(aaHJPag- ^y/ u4HGeH\刊 ʆ̲xt𶺣 Us=Dކ>'&H joojNk4ff6,-Nj5UI#Y.t j5kBel)JץVV2LԔ!'8%)EQW,ݢ -w#'X+*b ׏s-#2\!#F[J=;sw#ƔYSf[SFT+I 9 ٢6`!Ir@k=WZꛞ |{5 3ҚLM鼦,*fçPU$XXSp\#s cB+Kʕv%Y+pw |uT7+U][ p82mG 1۾U+?})c'~tlqI>RKBJp©JךeY7@mMLҷ<{5C,3H. >l +-0)Ie ҇g#ob4]UyIl-t`6.T< >_cO8#;S3/?J|cv,J-h@4wQh| <\@9[H +̧60YE@WUHXmJ?zVU,ssGÝay϶'WOU$6x|Ox b!h ?>y+ݹŏz`I<׿;?/VLxuuy)-߹2\ɿ 7Q ?eRtϾ= h<_~3/jL<LW2Ty/.H]鸢,[(JZgRT Bjm;_|4po}ŭo@DS| 3J'ɯed-wJ(v":@F:T6%X)\TH?_j.eY kQhBкwo!L_{ؗJo_Ǯ{"9Ţ/vsCEl^ T=aWԚ) Ŏ;'A5wY#ݿO~V4^ D3El`<}yȭ㳀Bdɝ_n|m|4\BCo fsf}`t3@ 6K<ҾX:]?{W9nJcl`>%s!p(cOߏTtwMZ=mxfd_Ū~KO/0ݶuӒ.#o6⹳bYtv;]ή.oo~wg}> -iH+d3B 񞀠9bڙ?u'a" ?c| C}ra㸵?Ril8;v]#)HiH 6]i<_Jhךt] *BJ&MaK&%hI8Bɤ>nZɤDje:՘'! $f\GP}Ffx A`ZQjl,fVuUp#UMI#YuiHe1 @f9o 7/؋;Kxx T.kpOދPWНC}ڑND0T'z) zEy&gƃAfm6󬆦 :VOAJ٦.y}seф݆:IfDEo[* a>h@Z4h--?tW G3Cg2ќp6,U1J%}rmNĦDgib6p4hg 5s:s5g1Ǒȑi"O@1pӣ"Ve_x-Ahk{T?O7Zvϻ?E;]/$W!{&ŇS>Ϸ+?߮|*o}!Qm8kJ(2H¬'IkCDj52Kʦi$Z͏[??,>JmWEnG뚳>MuER?Y͹liٝs1vM̞{KN\ZT.^Fh2B!pVi*^ 6HNMkl;~ŵuspOz"ELjnٗ@lް+)}6ArG<aHxE֝o-Ӯ(u4e읻9\{WB+Q4 K$m)nsKD]YjCg "nq2cV~cťd`~vك@5p@T59{winL88 OkmU4Υ Y(=v,AP &$*QWEC@ `XfM:Wohv4$eЌˉP186T<*~/Ir]Ϗ,; 4..=b.i "):Glj!s~㘑M״zM]D}K״c ;Qq{5MȑRTLh=/kRi5%h ɟ? &=GGea"ƠlW'evz|R @in~M $o^t9A X*GJqC4*/H\#uHw{,d<[gR 6뷺u¢ic|~XxjV)V~v'/Jq6OAͧ_mvgc>l?V ޿ džo_q9kn@+ \cѾ}nOi^Ů2PIrrSRl305A@-+>d0PC2|7 FpfBwHPSF$̙;]F!wDzBԒ%LB]s[$i?uISΫޡI`\QIGN3rd[G'48%V>&W? 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 17:32:31 crc kubenswrapper[4690]: body: Mar 20 17:32:31 crc kubenswrapper[4690]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:19.642961082 +0000 UTC m=+14.508786800,LastTimestamp:2026-03-20 17:32:19.642961082 +0000 UTC m=+14.508786800,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:32:31 crc kubenswrapper[4690]: > Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.750786 4690 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9d030c0abeb4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:19.643047604 +0000 UTC m=+14.508873322,LastTimestamp:2026-03-20 17:32:19.643047604 +0000 UTC m=+14.508873322,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.755752 4690 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189e9d0091292bc3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9d0091292bc3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:08.991509443 +0000 UTC m=+3.857335121,LastTimestamp:2026-03-20 17:32:20.011708618 +0000 UTC m=+14.877534336,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.759532 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9d009cd002ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9d009cd002ca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:09.186992842 +0000 UTC m=+4.052818520,LastTimestamp:2026-03-20 17:32:20.230535387 +0000 UTC m=+15.096361105,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.764192 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9d009d812720\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9d009d812720 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:09.198602016 +0000 UTC m=+4.064427694,LastTimestamp:2026-03-20 17:32:20.240312301 +0000 UTC m=+15.106137999,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.769150 4690 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 17:32:31 crc 
kubenswrapper[4690]: &Event{ObjectMeta:{kube-apiserver-crc.189e9d0363168972 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 17:32:31 crc kubenswrapper[4690]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:32:31 crc kubenswrapper[4690]: Mar 20 17:32:31 crc kubenswrapper[4690]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:21.103438194 +0000 UTC m=+15.969263892,LastTimestamp:2026-03-20 17:32:21.103438194 +0000 UTC m=+15.969263892,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:32:31 crc kubenswrapper[4690]: > Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.773970 4690 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9d0363172d6f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:21.103480175 +0000 UTC m=+15.969305873,LastTimestamp:2026-03-20 17:32:21.103480175 +0000 UTC m=+15.969305873,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.778438 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9d0363168972\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 17:32:31 crc kubenswrapper[4690]: &Event{ObjectMeta:{kube-apiserver-crc.189e9d0363168972 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 17:32:31 crc kubenswrapper[4690]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:32:31 crc kubenswrapper[4690]: Mar 20 17:32:31 crc kubenswrapper[4690]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:21.103438194 +0000 UTC m=+15.969263892,LastTimestamp:2026-03-20 17:32:21.115140533 +0000 UTC m=+15.980966221,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:32:31 crc kubenswrapper[4690]: > Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.784081 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9d0363172d6f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9d0363172d6f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:21.103480175 +0000 UTC m=+15.969305873,LastTimestamp:2026-03-20 17:32:21.115171874 +0000 UTC m=+15.980997562,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.793341 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9d030c096cba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:32:31 crc kubenswrapper[4690]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9d030c096cba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 17:32:31 crc kubenswrapper[4690]: body: Mar 20 17:32:31 crc kubenswrapper[4690]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:19.642961082 +0000 UTC m=+14.508786800,LastTimestamp:2026-03-20 17:32:29.642958864 +0000 UTC m=+24.508784582,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:32:31 crc kubenswrapper[4690]: > Mar 20 17:32:31 crc kubenswrapper[4690]: E0320 17:32:31.799827 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9d030c0abeb4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9d030c0abeb4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:19.643047604 +0000 UTC m=+14.508873322,LastTimestamp:2026-03-20 17:32:29.643025206 +0000 UTC m=+24.508850924,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:31 crc kubenswrapper[4690]: I0320 17:32:31.816335 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:32 crc kubenswrapper[4690]: W0320 17:32:32.574749 4690 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 17:32:32 crc kubenswrapper[4690]: E0320 17:32:32.574841 4690 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:32:32 crc kubenswrapper[4690]: I0320 17:32:32.817788 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:33 crc kubenswrapper[4690]: I0320 17:32:33.815954 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:34 crc kubenswrapper[4690]: W0320 17:32:34.080034 4690 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 17:32:34 crc kubenswrapper[4690]: E0320 17:32:34.080404 4690 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:32:34 crc kubenswrapper[4690]: I0320 17:32:34.516477 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:34 crc kubenswrapper[4690]: E0320 17:32:34.518177 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:32:34 crc kubenswrapper[4690]: I0320 17:32:34.518434 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:34 crc kubenswrapper[4690]: I0320 17:32:34.518472 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:34 crc kubenswrapper[4690]: I0320 17:32:34.518484 4690 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:34 crc kubenswrapper[4690]: I0320 17:32:34.518510 4690 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:32:34 crc kubenswrapper[4690]: E0320 17:32:34.523052 4690 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:32:34 crc kubenswrapper[4690]: I0320 17:32:34.817382 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:35 crc kubenswrapper[4690]: I0320 17:32:35.814702 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:35 crc kubenswrapper[4690]: E0320 17:32:35.971791 4690 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:32:36 crc kubenswrapper[4690]: I0320 17:32:36.815376 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:37 crc kubenswrapper[4690]: I0320 17:32:37.818052 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.294526 4690 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:43704->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.294643 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:43704->192.168.126.11:10357: read: connection reset by peer" Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.294743 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.294992 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.297341 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.297426 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.297448 4690 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.298347 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.298629 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2" gracePeriod=30 Mar 20 17:32:38 crc kubenswrapper[4690]: E0320 17:32:38.301650 4690 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:32:38 crc kubenswrapper[4690]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9d0763c2ebe5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:43704->192.168.126.11:10357: read: connection reset by peer Mar 20 17:32:38 crc kubenswrapper[4690]: body: Mar 20 17:32:38 crc kubenswrapper[4690]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:38.294604773 +0000 UTC m=+33.160430491,LastTimestamp:2026-03-20 17:32:38.294604773 +0000 UTC m=+33.160430491,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:32:38 crc kubenswrapper[4690]: > Mar 20 17:32:38 crc kubenswrapper[4690]: E0320 17:32:38.305502 4690 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9d0763c44506 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:43704->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:38.294693126 +0000 UTC m=+33.160518844,LastTimestamp:2026-03-20 17:32:38.294693126 +0000 UTC m=+33.160518844,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:38 crc kubenswrapper[4690]: E0320 17:32:38.310542 4690 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9d0764001ce8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:38.298615016 +0000 UTC m=+33.164440754,LastTimestamp:2026-03-20 17:32:38.298615016 +0000 UTC m=+33.164440754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:38 crc kubenswrapper[4690]: I0320 17:32:38.816976 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:38 crc kubenswrapper[4690]: E0320 17:32:38.832886 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9d001c82087a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9d001c82087a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:07.034398842 +0000 UTC m=+1.900224520,LastTimestamp:2026-03-20 17:32:38.825396548 +0000 UTC m=+33.691222226,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:39 crc kubenswrapper[4690]: E0320 17:32:39.062145 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9d0032f5118b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9d0032f5118b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:07.411036555 +0000 UTC m=+2.276862273,LastTimestamp:2026-03-20 17:32:39.056007728 +0000 UTC 
m=+33.921833446,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:39 crc kubenswrapper[4690]: E0320 17:32:39.078697 4690 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9d00343db03c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9d00343db03c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:32:07.432572988 +0000 UTC m=+2.298398706,LastTimestamp:2026-03-20 17:32:39.070549237 +0000 UTC m=+33.936374915,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.114221 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.114974 4690 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2" exitCode=255 Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.115094 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2"} Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.115196 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044"} Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.115388 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.117088 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.117131 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.117141 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:39 crc kubenswrapper[4690]: I0320 17:32:39.816294 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:40 crc kubenswrapper[4690]: I0320 17:32:40.817246 4690 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:41 crc kubenswrapper[4690]: I0320 17:32:41.523672 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:41 crc kubenswrapper[4690]: I0320 17:32:41.525732 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:41 crc kubenswrapper[4690]: I0320 17:32:41.526068 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:41 crc kubenswrapper[4690]: I0320 17:32:41.526311 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:41 crc kubenswrapper[4690]: I0320 17:32:41.526890 4690 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:32:41 crc kubenswrapper[4690]: E0320 17:32:41.527849 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:32:41 crc kubenswrapper[4690]: E0320 17:32:41.535225 4690 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:32:41 crc kubenswrapper[4690]: I0320 17:32:41.817234 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:42 crc kubenswrapper[4690]: I0320 17:32:42.817664 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:42 crc kubenswrapper[4690]: I0320 17:32:42.882705 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:42 crc kubenswrapper[4690]: I0320 17:32:42.885060 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:42 crc kubenswrapper[4690]: I0320 17:32:42.885147 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:42 crc kubenswrapper[4690]: I0320 17:32:42.885169 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:42 crc kubenswrapper[4690]: I0320 17:32:42.886183 4690 scope.go:117] "RemoveContainer" containerID="55b32a129587401c0080e925ffcac9c03d6820d42f27f54feeb0828b6326ade4" Mar 20 17:32:43 crc kubenswrapper[4690]: I0320 17:32:43.816111 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.134103 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.138032 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436"} Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.138311 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.139768 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.139856 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.139883 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.282334 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.282745 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.284114 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.284168 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.284192 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:44 crc kubenswrapper[4690]: I0320 17:32:44.816282 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.143365 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.145097 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.148942 4690 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436" exitCode=255 Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.148997 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436"} Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.149052 4690 scope.go:117] "RemoveContainer" containerID="55b32a129587401c0080e925ffcac9c03d6820d42f27f54feeb0828b6326ade4" Mar 20 
17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.149225 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.152046 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.152250 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.152462 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.153476 4690 scope.go:117] "RemoveContainer" containerID="efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436" Mar 20 17:32:45 crc kubenswrapper[4690]: E0320 17:32:45.153955 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.428762 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:32:45 crc kubenswrapper[4690]: I0320 17:32:45.816571 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:45 crc kubenswrapper[4690]: E0320 17:32:45.972846 4690 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.155493 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.159152 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.160363 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.160414 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.160431 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.161167 4690 scope.go:117] "RemoveContainer" containerID="efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436" Mar 20 17:32:46 crc kubenswrapper[4690]: E0320 17:32:46.161474 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.642370 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.642656 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.644367 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.644621 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.644800 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.669154 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:32:46 crc kubenswrapper[4690]: I0320 17:32:46.816017 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.161541 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.162845 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.162866 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.162875 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:47 crc kubenswrapper[4690]: W0320 17:32:47.583788 4690 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:47 crc kubenswrapper[4690]: E0320 17:32:47.584336 4690 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:32:47 crc kubenswrapper[4690]: W0320 17:32:47.631356 4690 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 17:32:47 crc kubenswrapper[4690]: E0320 17:32:47.631454 4690 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.728400 4690 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.728630 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.730367 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.730422 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.730441 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.731295 4690 scope.go:117] "RemoveContainer" containerID="efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436" Mar 20 17:32:47 crc kubenswrapper[4690]: E0320 17:32:47.731615 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:32:47 crc kubenswrapper[4690]: I0320 17:32:47.817030 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:48 crc kubenswrapper[4690]: E0320 17:32:48.533492 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:32:48 crc kubenswrapper[4690]: I0320 17:32:48.536403 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:48 crc kubenswrapper[4690]: I0320 17:32:48.537722 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:48 crc kubenswrapper[4690]: I0320 17:32:48.537763 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:48 crc kubenswrapper[4690]: I0320 17:32:48.537780 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:48 crc kubenswrapper[4690]: I0320 17:32:48.537816 4690 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:32:48 crc kubenswrapper[4690]: E0320 17:32:48.542510 4690 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:32:48 crc kubenswrapper[4690]: I0320 17:32:48.815313 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:49 crc kubenswrapper[4690]: W0320 17:32:49.352856 4690 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 17:32:49 crc kubenswrapper[4690]: E0320 17:32:49.352909 4690 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:32:49 crc kubenswrapper[4690]: I0320 17:32:49.813854 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:50 crc kubenswrapper[4690]: I0320 17:32:50.815697 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:51 crc kubenswrapper[4690]: I0320 17:32:51.816243 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:52 crc kubenswrapper[4690]: I0320 17:32:52.816795 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:53 crc kubenswrapper[4690]: I0320 17:32:53.816063 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:54 crc kubenswrapper[4690]: I0320 17:32:54.286318 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:32:54 crc kubenswrapper[4690]: I0320 17:32:54.286541 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:54 crc kubenswrapper[4690]: I0320 17:32:54.287880 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:54 crc kubenswrapper[4690]: I0320 17:32:54.287928 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:54 crc kubenswrapper[4690]: I0320 17:32:54.287945 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:54 crc kubenswrapper[4690]: I0320 17:32:54.816581 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:55 crc kubenswrapper[4690]: E0320 17:32:55.541464 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get 
resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:32:55 crc kubenswrapper[4690]: I0320 17:32:55.543644 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:32:55 crc kubenswrapper[4690]: I0320 17:32:55.545518 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:32:55 crc kubenswrapper[4690]: I0320 17:32:55.545575 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:32:55 crc kubenswrapper[4690]: I0320 17:32:55.545599 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:32:55 crc kubenswrapper[4690]: I0320 17:32:55.545643 4690 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:32:55 crc kubenswrapper[4690]: E0320 17:32:55.552725 4690 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:32:55 crc kubenswrapper[4690]: I0320 17:32:55.816493 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:55 crc kubenswrapper[4690]: E0320 17:32:55.973100 4690 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:32:56 crc kubenswrapper[4690]: I0320 17:32:56.815870 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:57 crc kubenswrapper[4690]: I0320 17:32:57.816957 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:58 crc kubenswrapper[4690]: I0320 17:32:58.812931 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:32:59 crc kubenswrapper[4690]: W0320 17:32:59.662067 4690 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 17:32:59 crc kubenswrapper[4690]: E0320 17:32:59.662180 4690 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:32:59 crc kubenswrapper[4690]: I0320 17:32:59.816663 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:00 
crc kubenswrapper[4690]: I0320 17:33:00.085099 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.085218 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.086624 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.086691 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.086711 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.816507 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.882960 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.889572 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.889677 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.889703 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:00 crc kubenswrapper[4690]: I0320 17:33:00.891775 4690 scope.go:117] "RemoveContainer" containerID="efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436" Mar 20 17:33:00 crc kubenswrapper[4690]: E0320 17:33:00.892106 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:33:01 crc kubenswrapper[4690]: I0320 17:33:01.817628 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:02 crc kubenswrapper[4690]: E0320 17:33:02.550479 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:33:02 crc kubenswrapper[4690]: I0320 17:33:02.553590 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:02 crc kubenswrapper[4690]: I0320 17:33:02.555495 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:02 crc kubenswrapper[4690]: I0320 17:33:02.555598 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:02 crc kubenswrapper[4690]: I0320 17:33:02.555621 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:02 crc kubenswrapper[4690]: I0320 17:33:02.555673 4690 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:33:02 crc kubenswrapper[4690]: E0320 17:33:02.562978 4690 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:33:02 crc kubenswrapper[4690]: I0320 17:33:02.814527 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:03 crc kubenswrapper[4690]: I0320 17:33:03.817479 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:04 crc kubenswrapper[4690]: I0320 17:33:04.818106 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:05 crc kubenswrapper[4690]: I0320 17:33:05.818344 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:05 crc kubenswrapper[4690]: E0320 17:33:05.973500 4690 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:33:06 crc kubenswrapper[4690]: I0320 17:33:06.818216 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:07 crc kubenswrapper[4690]: I0320 17:33:07.817703 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:08 crc kubenswrapper[4690]: I0320 17:33:08.818053 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:09 crc kubenswrapper[4690]: E0320 17:33:09.555914 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:33:09 crc kubenswrapper[4690]: I0320 17:33:09.563991 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:09 crc kubenswrapper[4690]: I0320 
17:33:09.565740 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:09 crc kubenswrapper[4690]: I0320 17:33:09.565799 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:09 crc kubenswrapper[4690]: I0320 17:33:09.565818 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:09 crc kubenswrapper[4690]: I0320 17:33:09.565862 4690 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:33:09 crc kubenswrapper[4690]: E0320 17:33:09.570766 4690 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:33:09 crc kubenswrapper[4690]: I0320 17:33:09.816035 4690 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:33:10 crc kubenswrapper[4690]: I0320 17:33:10.537284 4690 csr.go:261] certificate signing request csr-tg57l is approved, waiting to be issued Mar 20 17:33:10 crc kubenswrapper[4690]: I0320 17:33:10.545895 4690 csr.go:257] certificate signing request csr-tg57l is issued Mar 20 17:33:10 crc kubenswrapper[4690]: I0320 17:33:10.605571 4690 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 17:33:10 crc kubenswrapper[4690]: I0320 17:33:10.642058 4690 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 17:33:11 crc kubenswrapper[4690]: I0320 17:33:11.547340 4690 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-17 03:11:50.538216112 +0000 UTC Mar 20 17:33:11 crc kubenswrapper[4690]: I0320 17:33:11.547390 4690 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5793h38m38.990831032s for next certificate rotation Mar 20 17:33:15 crc kubenswrapper[4690]: I0320 17:33:15.882534 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:15 crc kubenswrapper[4690]: I0320 17:33:15.884301 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:15 crc kubenswrapper[4690]: I0320 17:33:15.884357 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:15 crc kubenswrapper[4690]: I0320 17:33:15.884375 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:15 crc kubenswrapper[4690]: I0320 17:33:15.885397 4690 scope.go:117] "RemoveContainer" containerID="efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436" Mar 20 17:33:15 crc kubenswrapper[4690]: E0320 17:33:15.973666 4690 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.249353 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:33:16 
crc kubenswrapper[4690]: I0320 17:33:16.251074 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4"} Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.251292 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.253294 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.253353 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.253372 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.571600 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.573407 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.573600 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.573630 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.573789 4690 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.587039 4690 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.587385 4690 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.587434 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.592508 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.592544 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.592590 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.592614 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.592632 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:16Z","lastTransitionTime":"2026-03-20T17:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.617011 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.629310 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.629360 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.629377 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.629404 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.629421 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:16Z","lastTransitionTime":"2026-03-20T17:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.642067 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.651374 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.651417 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.651437 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.651464 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.651808 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:16Z","lastTransitionTime":"2026-03-20T17:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.665805 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.678867 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.678939 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.678960 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.678983 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:16 crc kubenswrapper[4690]: I0320 17:33:16.679001 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:16Z","lastTransitionTime":"2026-03-20T17:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.691339 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.691788 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.691891 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.792311 4690 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.892680 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:16 crc kubenswrapper[4690]: E0320 17:33:16.993852 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.094362 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.195065 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.263842 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.264567 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.266186 4690 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" exitCode=255 Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.266252 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4"} Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.266333 4690 scope.go:117] "RemoveContainer" containerID="efe6cb07e25674ec32374ae6292c27ead95724ecbb9d9724799b4adc34714436" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.266562 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.268697 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.268877 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.268895 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.270721 4690 scope.go:117] "RemoveContainer" containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.271176 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.295615 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 
17:33:17.396421 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.497512 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.598062 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.698725 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: I0320 17:33:17.728195 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.799328 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:17 crc kubenswrapper[4690]: E0320 17:33:17.900380 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.000584 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.101568 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.202499 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: I0320 17:33:18.271658 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:33:18 crc kubenswrapper[4690]: I0320 17:33:18.275112 4690 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:33:18 crc kubenswrapper[4690]: I0320 17:33:18.276384 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:18 crc kubenswrapper[4690]: I0320 17:33:18.276591 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:18 crc kubenswrapper[4690]: I0320 17:33:18.276741 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:18 crc kubenswrapper[4690]: I0320 17:33:18.277999 4690 scope.go:117] "RemoveContainer" containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.278561 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.302959 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.403281 4690 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.503850 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.605023 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.705795 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.806471 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:18 crc kubenswrapper[4690]: E0320 17:33:18.907334 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.007736 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.108850 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.209725 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.310379 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.410892 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.511674 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.611851 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.712930 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.813670 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:19 crc kubenswrapper[4690]: E0320 17:33:19.913866 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.014331 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.114470 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.215564 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.316739 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.417690 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.518379 4690 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.619493 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.720464 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.821524 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:20 crc kubenswrapper[4690]: E0320 17:33:20.922547 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.023926 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.125119 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.225862 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.326581 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.428144 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.529658 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.630770 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.731492 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.832651 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:21 crc kubenswrapper[4690]: E0320 17:33:21.933455 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.035242 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.136450 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.237412 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.338408 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.439385 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.540344 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.640739 4690 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.741101 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.841970 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: E0320 17:33:22.942529 4690 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:33:22 crc kubenswrapper[4690]: I0320 17:33:22.997417 4690 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.045356 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.045403 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.045428 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.045453 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.045471 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.148098 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.148169 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.148192 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.148227 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.148249 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.251195 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.251305 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.251326 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.251349 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.251368 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.354316 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.354409 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.354434 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.354464 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.354486 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.457600 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.457666 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.457689 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.457718 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.457737 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.560452 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.560513 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.560534 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.560562 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.560584 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.664190 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.664296 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.664315 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.664340 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.664361 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.767772 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.767823 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.767839 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.767862 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.767879 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.824830 4690 apiserver.go:52] "Watching apiserver" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.833242 4690 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.833710 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tzvwm","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-qhmg6","openshift-image-registry/node-ca-4rfg5","openshift-multus/network-metrics-daemon-bgj72","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-machine-config-operator/machine-config-daemon-wtg2q","openshift-multus/multus-bf8dm","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-7bsmm","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt"] Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.834202 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.834510 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.834623 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:23 crc kubenswrapper[4690]: E0320 17:33:23.834672 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.835135 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.835227 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:23 crc kubenswrapper[4690]: E0320 17:33:23.835458 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.836691 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.837415 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qhmg6" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.837978 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.838016 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.838049 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 17:33:23 crc kubenswrapper[4690]: E0320 17:33:23.838098 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.838109 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.838351 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.838413 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.838488 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.838820 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.842402 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.842563 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.843678 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.844816 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.845368 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:23 crc kubenswrapper[4690]: E0320 17:33:23.845609 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.847501 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bf8dm" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.847540 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.848797 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.849516 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.851201 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.852556 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.852847 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.856984 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.857016 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.857157 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.857347 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.857414 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.857466 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.858231 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.858582 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.858720 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.858851 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.858981 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.859151 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.859172 4690 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.859473 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.859659 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.859993 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.860132 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.860211 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.860220 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.860432 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.860757 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.860766 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.860927 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.865641 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.877504 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.877564 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.877582 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.877606 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.877625 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.883005 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.904066 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.917247 4690 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.917647 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.934690 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.951020 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.964274 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.977633 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.980713 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.980830 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.980846 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.980871 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.980886 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:23Z","lastTransitionTime":"2026-03-20T17:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984133 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984181 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984215 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984247 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984295 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984324 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984350 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984378 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984409 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984438 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984550 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984583 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984614 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984662 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984692 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984723 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984754 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984788 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984818 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984848 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984880 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984911 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984940 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.984973 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985005 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985038 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985068 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985098 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985128 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985159 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985188 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985219 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985250 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985305 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985342 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985372 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985404 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985433 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985487 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985518 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985563 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985594 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985628 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985660 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985694 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985823 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985860 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985892 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985922 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985952 4690 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.985980 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986009 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986038 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986066 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986095 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986129 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986156 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986187 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986216 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 
17:33:23.986244 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986305 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986333 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986364 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986394 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986424 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986455 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986484 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986512 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986547 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986577 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986606 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986638 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986666 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986694 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986724 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986753 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986782 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986811 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986842 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986897 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986925 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986953 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.986983 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987013 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987048 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987079 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987108 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987137 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987167 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987195 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987224 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987284 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987314 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987345 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987375 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987410 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987444 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.987472 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.988218 4690 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.988432 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.988698 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.988945 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989049 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989069 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989114 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989303 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989344 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989494 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989549 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989756 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989795 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989826 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.989837 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990086 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990205 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990442 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990482 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990518 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990549 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990581 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990613 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990647 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990783 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990818 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990850 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.990929 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991033 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991074 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991108 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991142 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991180 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991246 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991301 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991335 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991372 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991405 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991436 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991466 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991498 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991541 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991571 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991600 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991627 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991661 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991691 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991717 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991758 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991787 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991819 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991850 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991880 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991913 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991946 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.991985 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992021 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992050 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992080 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992110 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992141 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992176 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992211 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992244 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992312 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992345 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992376 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992410 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992444 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:33:23 crc kubenswrapper[4690]: I0320 17:33:23.992486 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.003177 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990301 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990403 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990413 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990583 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990613 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990655 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990858 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990862 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990940 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.990965 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.991116 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.991788 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.991793 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.991814 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.992457 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:23.992497 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:33:24.492480769 +0000 UTC m=+79.358306437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004215 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004301 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004346 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004376 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004410 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004438 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004443 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004472 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004582 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004696 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004770 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004847 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.004931 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.005765 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006317 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006400 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006432 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006457 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006477 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006508 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006574 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006603 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006629 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006671 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006697 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006718 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006741 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006761 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006780 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006817 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006841 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006902 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006926 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006950 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006973 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006992 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007015 4690 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007038 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007059 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007083 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007104 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007125 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007145 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007165 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007284 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-script-lib\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007317 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007341 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007369 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007389 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-slash\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007411 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-cni-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007433 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v64dg\" (UniqueName: \"kubernetes.io/projected/c18651e4-89e3-43fd-a780-bfa6df87591e-kube-api-access-v64dg\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007453 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-ovn\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007473 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007492 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007512 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007535 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79kbc\" (UniqueName: \"kubernetes.io/projected/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-kube-api-access-79kbc\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007556 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c18651e4-89e3-43fd-a780-bfa6df87591e-proxy-tls\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007579 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007598 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007622 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007642 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c18651e4-89e3-43fd-a780-bfa6df87591e-rootfs\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007659 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007678 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3-hosts-file\") pod \"node-resolver-qhmg6\" (UID: \"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\") " 
pod="openshift-dns/node-resolver-qhmg6" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007701 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-k8s-cni-cncf-io\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007719 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-conf-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007739 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007756 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007778 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007802 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007822 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-node-log\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007838 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-log-socket\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007856 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-hostroot\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007877 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-os-release\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007897 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lb8q\" (UniqueName: \"kubernetes.io/projected/e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3-kube-api-access-7lb8q\") pod \"node-resolver-qhmg6\" (UID: \"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\") " pod="openshift-dns/node-resolver-qhmg6" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007916 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c18651e4-89e3-43fd-a780-bfa6df87591e-mcd-auth-proxy-config\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007933 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-serviceca\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007954 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-os-release\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007971 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/189715be-f690-4a1d-9bd3-fb0dcae7affe-cni-binary-copy\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007991 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-cni-bin\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008009 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-bin\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008029 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008048 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-env-overrides\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008068 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmwk9\" (UniqueName: \"kubernetes.io/projected/01a728ab-e286-4606-b922-d510978b863a-kube-api-access-nmwk9\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008086 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmghf\" (UniqueName: \"kubernetes.io/projected/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-kube-api-access-qmghf\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008106 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-etc-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008129 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008153 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008175 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01a728ab-e286-4606-b922-d510978b863a-ovn-node-metrics-cert\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008192 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" 
(UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008213 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-socket-dir-parent\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008233 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-daemon-config\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008270 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008289 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vwp\" (UniqueName: \"kubernetes.io/projected/189715be-f690-4a1d-9bd3-fb0dcae7affe-kube-api-access-z9vwp\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008308 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-systemd\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008329 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzj2d\" (UniqueName: \"kubernetes.io/projected/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-kube-api-access-zzj2d\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008353 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqjv\" (UniqueName: \"kubernetes.io/projected/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-kube-api-access-djqjv\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008373 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-netd\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008393 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008413 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-kubelet\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008435 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-etc-kubernetes\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008456 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-netns\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008474 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cnibin\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008496 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008517 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-host\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008536 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-kubelet\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008553 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-systemd-units\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008578 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008599 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-cni-multus\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008647 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-multus-certs\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008668 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-system-cni-dir\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008693 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008734 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-var-lib-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008772 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008801 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-system-cni-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008817 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-cnibin\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " 
pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008836 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-netns\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008856 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-config\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008937 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008955 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008966 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008979 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008992 4690 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009007 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009019 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009031 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009042 4690 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009059 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" 
Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009070 4690 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009081 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009096 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009108 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009120 4690 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009132 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.005316 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.005376 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.992538 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.992954 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.993018 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.993067 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.993404 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.993446 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.994135 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.994232 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.994461 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.994731 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.012801 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.994746 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.994786 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.995164 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.995467 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.995551 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.995771 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.996145 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.996322 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.996568 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.996567 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.996604 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.996956 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.997197 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.997545 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.997963 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.998225 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.998161 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.998429 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.999363 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.999424 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.999947 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000029 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000249 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000156 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000386 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000551 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000583 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000917 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000774 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.001185 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.001198 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.001195 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.001468 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.001518 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.001726 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.001803 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.001814 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002063 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002158 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002323 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002364 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002203 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002409 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002587 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002730 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.002962 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.003383 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.003716 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.003985 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.005389 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.005524 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.005705 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006117 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006142 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006237 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006708 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.006819 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007513 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.007970 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008743 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.008971 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009102 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009096 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:23.992534 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009356 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009538 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.009774 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.010049 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.010174 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.010312 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.010321 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.010388 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.010497 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.010919 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.011204 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.011540 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.011601 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.011848 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.011877 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.011912 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.012448 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.013102 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.013131 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.000783 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.013400 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.013468 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.013996 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.014853 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015062 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015154 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015166 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015179 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015198 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015421 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015472 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015504 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015508 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015610 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015706 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015555 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.015787 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.016079 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.016477 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.016767 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.016789 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.016916 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.016928 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.017469 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.017827 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.017881 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.017947 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.018998 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.019493 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.019633 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.020124 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.020185 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.020236 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.020360 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:24.520211463 +0000 UTC m=+79.386037151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.021318 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.020574 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.022485 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.022924 4690 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.022971 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.023034 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.023619 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.023796 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.023811 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.024536 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:24.524495995 +0000 UTC m=+79.390321733 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.026020 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.026196 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.031041 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.036389 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.036426 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.036742 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.037512 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.037824 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.044666 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.044688 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.045081 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.046028 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.046187 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.047449 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.047931 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.047988 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.048118 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.048181 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.048229 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.048244 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.048274 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.048325 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:24.548309287 +0000 UTC m=+79.414134965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.048318 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.048754 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.049028 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.049166 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.049324 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.049497 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.049931 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.050099 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.050163 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.050980 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.052545 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.053640 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.056911 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.058043 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.059316 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.059660 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.059694 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.059714 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.059781 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:24.559757485 +0000 UTC m=+79.425583193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.060250 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.065399 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.069649 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.071065 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/ce
rts\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.077041 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.083759 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.083801 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.083812 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.083828 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.083838 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.087957 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.097798 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109682 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-cni-bin\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109738 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-hostroot\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109782 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-os-release\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109783 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-cni-bin\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109813 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-hostroot\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109818 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lb8q\" (UniqueName: \"kubernetes.io/projected/e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3-kube-api-access-7lb8q\") pod \"node-resolver-qhmg6\" (UID: \"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\") " pod="openshift-dns/node-resolver-qhmg6" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109872 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c18651e4-89e3-43fd-a780-bfa6df87591e-mcd-auth-proxy-config\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109897 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-serviceca\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109922 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-os-release\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109943 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/189715be-f690-4a1d-9bd3-fb0dcae7affe-cni-binary-copy\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109963 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.109992 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-bin\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110000 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-os-release\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110020 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110046 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-env-overrides\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110069 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmwk9\" (UniqueName: \"kubernetes.io/projected/01a728ab-e286-4606-b922-d510978b863a-kube-api-access-nmwk9\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110091 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmghf\" (UniqueName: \"kubernetes.io/projected/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-kube-api-access-qmghf\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110112 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-etc-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110138 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-daemon-config\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110176 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01a728ab-e286-4606-b922-d510978b863a-ovn-node-metrics-cert\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110163 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-os-release\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110201 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110230 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-socket-dir-parent\") pod \"multus-bf8dm\" (UID: 
\"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110281 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-etc-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110293 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-systemd\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110324 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vwp\" (UniqueName: \"kubernetes.io/projected/189715be-f690-4a1d-9bd3-fb0dcae7affe-kube-api-access-z9vwp\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110352 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzj2d\" (UniqueName: \"kubernetes.io/projected/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-kube-api-access-zzj2d\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110368 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110375 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqjv\" (UniqueName: \"kubernetes.io/projected/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-kube-api-access-djqjv\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110458 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-netd\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110495 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-netns\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110528 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-kubelet\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " 
pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110559 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-etc-kubernetes\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110588 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cnibin\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110628 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c18651e4-89e3-43fd-a780-bfa6df87591e-mcd-auth-proxy-config\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110628 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-system-cni-dir\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110667 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-system-cni-dir\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110684 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110710 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-host\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110747 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-netns\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110769 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-kubelet\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc 
kubenswrapper[4690]: I0320 17:33:24.110792 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-systemd-units\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110822 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-etc-kubernetes\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110837 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-cni-multus\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110858 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-multus-certs\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110865 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cnibin\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111057 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-env-overrides\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111064 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-kubelet\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111137 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-host\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111147 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111161 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-socket-dir-parent\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110323 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111181 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-systemd\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111215 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-var-lib-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110791 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-kubelet\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111151 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-daemon-config\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110134 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-bin\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.111243 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111175 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-var-lib-openvswitch\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111288 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-serviceca\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.110715 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-netd\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111395 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-systemd-units\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111412 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-var-lib-cni-multus\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111439 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-multus-certs\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.111522 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:24.611493046 +0000 UTC m=+79.477318954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111549 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-system-cni-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111677 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-system-cni-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111675 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-cnibin\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111732 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-netns\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 
17:33:24.111794 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-cnibin\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111824 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-netns\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111849 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-config\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.111943 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-script-lib\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112331 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-cni-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112375 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-slash\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112403 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79kbc\" (UniqueName: \"kubernetes.io/projected/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-kube-api-access-79kbc\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112430 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v64dg\" (UniqueName: \"kubernetes.io/projected/c18651e4-89e3-43fd-a780-bfa6df87591e-kube-api-access-v64dg\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112453 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-ovn\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112478 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112513 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112534 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c18651e4-89e3-43fd-a780-bfa6df87591e-rootfs\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112556 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c18651e4-89e3-43fd-a780-bfa6df87591e-proxy-tls\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112579 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112609 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-conf-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112661 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112686 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3-hosts-file\") pod \"node-resolver-qhmg6\" (UID: \"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\") " pod="openshift-dns/node-resolver-qhmg6" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112708 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-k8s-cni-cncf-io\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112730 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-log-socket\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112752 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112774 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112797 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112829 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-node-log\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112841 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c18651e4-89e3-43fd-a780-bfa6df87591e-rootfs\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112957 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-host-run-k8s-cni-cncf-io\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112957 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.112987 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-log-socket\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.113025 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.113070 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-config\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.113177 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-conf-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.113894 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114114 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114193 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3-hosts-file\") pod \"node-resolver-qhmg6\" (UID: \"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\") " pod="openshift-dns/node-resolver-qhmg6" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114205 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-slash\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114302 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/189715be-f690-4a1d-9bd3-fb0dcae7affe-multus-cni-dir\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114342 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-script-lib\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114360 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-node-log\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114376 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-ovn\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114356 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114455 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114478 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114352 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114515 4690 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114537 4690 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc 
kubenswrapper[4690]: I0320 17:33:24.114556 4690 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114576 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114593 4690 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114610 4690 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114629 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114647 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114664 4690 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114681 4690 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114693 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114698 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114725 4690 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114736 4690 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114746 4690 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114754 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114762 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114771 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114780 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114788 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114797 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114806 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114814 4690 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114822 4690 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114831 4690 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114839 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114850 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114858 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114866 4690 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114875 4690 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114885 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114893 4690 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114901 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114909 4690 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114918 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114928 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114937 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114946 4690 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114954 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114962 4690 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114971 4690 reconciler_common.go:293] "Volume detached for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114979 4690 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114987 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.114996 4690 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115005 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115014 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115001 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115022 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115099 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115125 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115129 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115177 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: 
I0320 17:33:24.115188 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115198 4690 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115207 4690 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115216 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115227 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115237 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115248 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115277 4690 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115286 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115296 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115307 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115316 4690 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115325 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115334 4690 reconciler_common.go:293] "Volume 
detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115343 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115354 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115363 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115372 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115381 4690 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115390 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115398 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115407 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115416 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115444 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115453 4690 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115462 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115471 4690 reconciler_common.go:293] "Volume detached for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115479 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115488 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115497 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115506 4690 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115514 4690 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115523 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115530 4690 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115539 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115600 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115610 4690 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115618 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115674 4690 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115686 4690 reconciler_common.go:293] "Volume detached for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115697 4690 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115709 4690 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115892 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115905 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115916 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115926 4690 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115937 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115946 4690 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115954 4690 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115963 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115974 4690 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115983 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.115992 4690 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116001 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116011 4690 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116022 4690 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116037 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116046 4690 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116055 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116064 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116075 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116086 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116097 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116108 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116121 4690 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc 
kubenswrapper[4690]: I0320 17:33:24.116135 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116147 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116159 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116169 4690 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116179 4690 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116188 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116198 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116207 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116216 4690 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116225 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116233 4690 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116242 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116264 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 
17:33:24.116274 4690 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116283 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116293 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116302 4690 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116310 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116320 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116330 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116339 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116348 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116358 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116368 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116376 4690 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116385 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 
17:33:24.116394 4690 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.116406 4690 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117762 4690 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117781 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117793 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117805 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117818 4690 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117829 4690 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117837 4690 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117846 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117855 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117863 4690 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117873 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117881 4690 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117889 4690 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117898 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117907 4690 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.117916 4690 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118109 4690 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118121 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118129 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118168 4690 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118202 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118213 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118221 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118230 4690 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118239 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118247 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.118269 4690 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.126486 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.126497 4690 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.126539 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.126551 4690 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.126587 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.126595 4690 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.126603 4690 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.134239 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/189715be-f690-4a1d-9bd3-fb0dcae7affe-cni-binary-copy\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.134590 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.134892 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c18651e4-89e3-43fd-a780-bfa6df87591e-proxy-tls\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.134950 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01a728ab-e286-4606-b922-d510978b863a-ovn-node-metrics-cert\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.137225 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmwk9\" (UniqueName: \"kubernetes.io/projected/01a728ab-e286-4606-b922-d510978b863a-kube-api-access-nmwk9\") pod \"ovnkube-node-7bsmm\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.137765 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzj2d\" (UniqueName: \"kubernetes.io/projected/3f51dea1-fc10-4d4a-9065-2d0c020b36f9-kube-api-access-zzj2d\") pod \"ovnkube-control-plane-749d76644c-8nqtt\" (UID: \"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.138064 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqjv\" (UniqueName: \"kubernetes.io/projected/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-kube-api-access-djqjv\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.138665 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79kbc\" (UniqueName: \"kubernetes.io/projected/3fe7c1d1-7aa9-4c64-941e-7415a99367ea-kube-api-access-79kbc\") pod \"multus-additional-cni-plugins-tzvwm\" (UID: \"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\") " pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.138882 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lb8q\" (UniqueName: \"kubernetes.io/projected/e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3-kube-api-access-7lb8q\") pod \"node-resolver-qhmg6\" (UID: \"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\") " pod="openshift-dns/node-resolver-qhmg6" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.144524 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v64dg\" (UniqueName: \"kubernetes.io/projected/c18651e4-89e3-43fd-a780-bfa6df87591e-kube-api-access-v64dg\") pod \"machine-config-daemon-wtg2q\" (UID: \"c18651e4-89e3-43fd-a780-bfa6df87591e\") " pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.144995 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vwp\" (UniqueName: \"kubernetes.io/projected/189715be-f690-4a1d-9bd3-fb0dcae7affe-kube-api-access-z9vwp\") pod \"multus-bf8dm\" (UID: \"189715be-f690-4a1d-9bd3-fb0dcae7affe\") " pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.148871 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-qmghf\" (UniqueName: \"kubernetes.io/projected/deaf1de2-4906-4e89-ae1b-83b6d35f97a6-kube-api-access-qmghf\") pod \"node-ca-4rfg5\" (UID: \"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\") " pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.179000 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.186059 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.186174 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.186239 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.186350 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.186442 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.192059 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: source "/env/_master" Mar 20 17:33:24 crc kubenswrapper[4690]: set +o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 17:33:24 crc kubenswrapper[4690]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:33:24 crc kubenswrapper[4690]: ho_enable="--enable-hybrid-overlay" Mar 20 17:33:24 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:33:24 crc kubenswrapper[4690]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:33:24 crc kubenswrapper[4690]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --webhook-host=127.0.0.1 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --webhook-port=9743 \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ho_enable} \ Mar 20 17:33:24 crc kubenswrapper[4690]: --enable-interconnect \ Mar 20 17:33:24 crc kubenswrapper[4690]: --disable-approver \ Mar 20 17:33:24 crc kubenswrapper[4690]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --wait-for-kubernetes-api=200s \ Mar 20 17:33:24 crc kubenswrapper[4690]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --loglevel="${LOGLEVEL}" Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.196015 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: source "/env/_master" Mar 20 17:33:24 crc kubenswrapper[4690]: set +o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --disable-webhook \ Mar 20 17:33:24 crc kubenswrapper[4690]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --loglevel="${LOGLEVEL}" Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.199065 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.199146 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.211856 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qhmg6" Mar 20 17:33:24 crc kubenswrapper[4690]: W0320 17:33:24.212306 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b93cfea60d3522dee79dce30c069739c0488ee788e80373ab7a47bc1713973d7 WatchSource:0}: Error finding container b93cfea60d3522dee79dce30c069739c0488ee788e80373ab7a47bc1713973d7: Status 404 returned error can't find the container with id b93cfea60d3522dee79dce30c069739c0488ee788e80373ab7a47bc1713973d7 Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.214247 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.215431 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" 
podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.222453 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:33:24 crc kubenswrapper[4690]: W0320 17:33:24.222855 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5abdfe2_a5f7_43a7_9c83_a9eb0dacdea3.slice/crio-86d9f448b29f6d363012e374cd70f5a6bc4a2b0752af19a9100eb22c6148d733 WatchSource:0}: Error finding container 86d9f448b29f6d363012e374cd70f5a6bc4a2b0752af19a9100eb22c6148d733: Status 404 returned error can't find the container with id 86d9f448b29f6d363012e374cd70f5a6bc4a2b0752af19a9100eb22c6148d733 Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.227007 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:24 crc kubenswrapper[4690]: set -uo pipefail Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 17:33:24 crc kubenswrapper[4690]: HOSTS_FILE="/etc/hosts" Mar 20 17:33:24 crc kubenswrapper[4690]: TEMP_FILE="/etc/hosts.tmp" Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # Make a temporary file with the old hosts file's attributes. Mar 20 17:33:24 crc kubenswrapper[4690]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 17:33:24 crc kubenswrapper[4690]: echo "Failed to preserve hosts file. Exiting." Mar 20 17:33:24 crc kubenswrapper[4690]: exit 1 Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: while true; do Mar 20 17:33:24 crc kubenswrapper[4690]: declare -A svc_ips Mar 20 17:33:24 crc kubenswrapper[4690]: for svc in "${services[@]}"; do Mar 20 17:33:24 crc kubenswrapper[4690]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 17:33:24 crc kubenswrapper[4690]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 17:33:24 crc kubenswrapper[4690]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 17:33:24 crc kubenswrapper[4690]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 20 17:33:24 crc kubenswrapper[4690]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:24 crc kubenswrapper[4690]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:24 crc kubenswrapper[4690]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:24 crc kubenswrapper[4690]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 17:33:24 crc kubenswrapper[4690]: for i in ${!cmds[*]} Mar 20 17:33:24 crc kubenswrapper[4690]: do Mar 20 17:33:24 crc kubenswrapper[4690]: ips=($(eval "${cmds[i]}")) Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: svc_ips["${svc}"]="${ips[@]}" Mar 20 17:33:24 crc kubenswrapper[4690]: break Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # Update /etc/hosts only if we get valid service IPs Mar 20 17:33:24 crc kubenswrapper[4690]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 17:33:24 crc kubenswrapper[4690]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 17:33:24 crc kubenswrapper[4690]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 17:33:24 crc kubenswrapper[4690]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:24 crc kubenswrapper[4690]: continue Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # Append resolver entries for services Mar 20 17:33:24 crc kubenswrapper[4690]: rc=0 Mar 20 17:33:24 crc kubenswrapper[4690]: for svc in "${!svc_ips[@]}"; do Mar 20 17:33:24 crc kubenswrapper[4690]: for ip in ${svc_ips[${svc}]}; do Mar 20 17:33:24 crc kubenswrapper[4690]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ $rc -ne 0 ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:24 crc kubenswrapper[4690]: continue Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 17:33:24 crc kubenswrapper[4690]: # Replace /etc/hosts with our modified version if needed Mar 20 17:33:24 crc kubenswrapper[4690]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 17:33:24 crc kubenswrapper[4690]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:24 crc kubenswrapper[4690]: unset svc_ips Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7lb8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-qhmg6_openshift-dns(e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.228555 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-qhmg6" podUID="e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.229667 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bf8dm" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.235829 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:24 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: source /etc/kubernetes/apiserver-url.env Mar 20 17:33:24 crc kubenswrapper[4690]: else Mar 20 17:33:24 crc kubenswrapper[4690]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:33:24 crc kubenswrapper[4690]: exit 1 Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.237171 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.238734 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4rfg5" Mar 20 17:33:24 crc kubenswrapper[4690]: W0320 17:33:24.243383 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod189715be_f690_4a1d_9bd3_fb0dcae7affe.slice/crio-accdbf74a09879e58f694782878a1045befc7d7e7fceca6df9bc219e07df74ea WatchSource:0}: Error finding container accdbf74a09879e58f694782878a1045befc7d7e7fceca6df9bc219e07df74ea: Status 404 returned error can't find the container with id accdbf74a09879e58f694782878a1045befc7d7e7fceca6df9bc219e07df74ea Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.245810 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 17:33:24 crc kubenswrapper[4690]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 17:33:24 crc kubenswrapper[4690]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z9vwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-bf8dm_openshift-multus(189715be-f690-4a1d-9bd3-fb0dcae7affe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.247017 4690 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-bf8dm" podUID="189715be-f690-4a1d-9bd3-fb0dcae7affe" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.248341 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" Mar 20 17:33:24 crc kubenswrapper[4690]: W0320 17:33:24.249125 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeaf1de2_4906_4e89_ae1b_83b6d35f97a6.slice/crio-436e58178161603373aea3c43474edc80dd2bb429ac61236abafd43e232700ef WatchSource:0}: Error finding container 436e58178161603373aea3c43474edc80dd2bb429ac61236abafd43e232700ef: Status 404 returned error can't find the container with id 436e58178161603373aea3c43474edc80dd2bb429ac61236abafd43e232700ef Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.252499 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 17:33:24 crc kubenswrapper[4690]: while [ true ]; Mar 20 17:33:24 crc kubenswrapper[4690]: do Mar 20 17:33:24 crc kubenswrapper[4690]: for f in $(ls /tmp/serviceca); do Mar 20 17:33:24 crc kubenswrapper[4690]: echo $f Mar 20 17:33:24 crc kubenswrapper[4690]: ca_file_path="/tmp/serviceca/${f}" Mar 20 17:33:24 crc kubenswrapper[4690]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 17:33:24 crc kubenswrapper[4690]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 17:33:24 crc kubenswrapper[4690]: if [ -e "${reg_dir_path}" ]; then Mar 20 17:33:24 crc kubenswrapper[4690]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 17:33:24 crc kubenswrapper[4690]: else Mar 20 17:33:24 crc kubenswrapper[4690]: mkdir $reg_dir_path Mar 20 17:33:24 crc kubenswrapper[4690]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: for d in $(ls /etc/docker/certs.d); do Mar 20 17:33:24 crc kubenswrapper[4690]: echo $d Mar 20 17:33:24 crc kubenswrapper[4690]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 17:33:24 crc kubenswrapper[4690]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 17:33:24 crc kubenswrapper[4690]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 17:33:24 crc kubenswrapper[4690]: rm -rf /etc/docker/certs.d/$d Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 60 & wait ${!} Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmghf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-4rfg5_openshift-image-registry(deaf1de2-4906-4e89-ae1b-83b6d35f97a6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.253625 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-4rfg5" podUID="deaf1de2-4906-4e89-ae1b-83b6d35f97a6" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.257060 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" Mar 20 17:33:24 crc kubenswrapper[4690]: W0320 17:33:24.258151 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe7c1d1_7aa9_4c64_941e_7415a99367ea.slice/crio-e38ffb03e7b4bd33817e263035245dbe1f5049e176917b7f5a342a321d69be15 WatchSource:0}: Error finding container e38ffb03e7b4bd33817e263035245dbe1f5049e176917b7f5a342a321d69be15: Status 404 returned error can't find the container with id e38ffb03e7b4bd33817e263035245dbe1f5049e176917b7f5a342a321d69be15 Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.261071 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-79kbc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-tzvwm_openshift-multus(3fe7c1d1-7aa9-4c64-941e-7415a99367ea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.262328 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" podUID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.265625 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:24 crc kubenswrapper[4690]: W0320 17:33:24.266420 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f51dea1_fc10_4d4a_9065_2d0c020b36f9.slice/crio-76dfffd762ac097f167a65809302339071c3584197ed9a50cb50228e0e655f2c WatchSource:0}: Error finding container 76dfffd762ac097f167a65809302339071c3584197ed9a50cb50228e0e655f2c: Status 404 returned error can't find the container with id 76dfffd762ac097f167a65809302339071c3584197ed9a50cb50228e0e655f2c Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.267998 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:24 crc kubenswrapper[4690]: set -euo pipefail Mar 20 17:33:24 crc kubenswrapper[4690]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 17:33:24 crc kubenswrapper[4690]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 17:33:24 crc kubenswrapper[4690]: # As the secret mount is optional we must wait for the files to be present. Mar 20 17:33:24 crc kubenswrapper[4690]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 17:33:24 crc kubenswrapper[4690]: TS=$(date +%s) Mar 20 17:33:24 crc kubenswrapper[4690]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 17:33:24 crc kubenswrapper[4690]: HAS_LOGGED_INFO=0 Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: log_missing_certs(){ Mar 20 17:33:24 crc kubenswrapper[4690]: CUR_TS=$(date +%s) Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 17:33:24 crc kubenswrapper[4690]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 17:33:24 crc kubenswrapper[4690]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 17:33:24 crc kubenswrapper[4690]: HAS_LOGGED_INFO=1 Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: } Mar 20 17:33:24 crc kubenswrapper[4690]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 17:33:24 crc kubenswrapper[4690]: log_missing_certs Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 5 Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/kube-rbac-proxy \ Mar 20 17:33:24 crc kubenswrapper[4690]: --logtostderr \ Mar 20 17:33:24 crc kubenswrapper[4690]: --secure-listen-address=:9108 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --upstream=http://127.0.0.1:29108/ \ Mar 20 17:33:24 crc kubenswrapper[4690]: --tls-private-key-file=${TLS_PK} \ Mar 20 17:33:24 crc kubenswrapper[4690]: --tls-cert-file=${TLS_CERT} Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzj2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-8nqtt_openshift-ovn-kubernetes(3f51dea1-fc10-4d4a-9065-2d0c020b36f9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.270404 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: source "/env/_master" Mar 20 17:33:24 crc kubenswrapper[4690]: set +o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v4_join_subnet_opt= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "" != "" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v6_join_subnet_opt= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "" != "" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 
17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v4_transit_switch_subnet_opt= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "" != "" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v6_transit_switch_subnet_opt= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "" != "" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: dns_name_resolver_enabled_flag= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "false" == "true" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: persistent_ips_enabled_flag= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "true" == "true" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # This is needed so that converting clusters from GA to TP Mar 20 17:33:24 crc kubenswrapper[4690]: # will rollout control plane pods as well Mar 20 17:33:24 crc kubenswrapper[4690]: network_segmentation_enabled_flag= Mar 20 17:33:24 crc kubenswrapper[4690]: multi_network_enabled_flag= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "true" == "true" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: multi_network_enabled_flag="--enable-multi-network" Mar 20 17:33:24 crc kubenswrapper[4690]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/ovnkube \ Mar 20 17:33:24 crc kubenswrapper[4690]: --enable-interconnect \ Mar 20 17:33:24 crc kubenswrapper[4690]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 17:33:24 crc kubenswrapper[4690]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --metrics-enable-pprof \ Mar 20 17:33:24 crc kubenswrapper[4690]: --metrics-enable-config-duration \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ovn_v4_join_subnet_opt} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ovn_v6_join_subnet_opt} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${dns_name_resolver_enabled_flag} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${persistent_ips_enabled_flag} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${multi_network_enabled_flag} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${network_segmentation_enabled_flag} Mar 20 17:33:24 crc kubenswrapper[4690]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzj2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-8nqtt_openshift-ovn-kubernetes(3f51dea1-fc10-4d4a-9065-2d0c020b36f9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.271859 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" podUID="3f51dea1-fc10-4d4a-9065-2d0c020b36f9" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.273014 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.279700 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 17:33:24 crc kubenswrapper[4690]: apiVersion: v1 Mar 20 17:33:24 crc kubenswrapper[4690]: clusters: Mar 20 17:33:24 crc kubenswrapper[4690]: - cluster: Mar 20 17:33:24 crc kubenswrapper[4690]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 17:33:24 crc kubenswrapper[4690]: server: https://api-int.crc.testing:6443 Mar 20 17:33:24 crc kubenswrapper[4690]: name: default-cluster Mar 20 17:33:24 crc kubenswrapper[4690]: contexts: Mar 20 17:33:24 crc kubenswrapper[4690]: - context: Mar 20 17:33:24 crc kubenswrapper[4690]: cluster: default-cluster Mar 20 17:33:24 crc kubenswrapper[4690]: namespace: default Mar 20 17:33:24 crc kubenswrapper[4690]: user: default-auth Mar 20 17:33:24 crc kubenswrapper[4690]: name: default-context Mar 20 17:33:24 crc kubenswrapper[4690]: current-context: default-context Mar 20 17:33:24 crc kubenswrapper[4690]: kind: Config Mar 20 17:33:24 crc kubenswrapper[4690]: preferences: {} Mar 20 17:33:24 crc kubenswrapper[4690]: users: Mar 20 17:33:24 crc kubenswrapper[4690]: - name: default-auth Mar 20 17:33:24 crc kubenswrapper[4690]: user: Mar 20 17:33:24 crc kubenswrapper[4690]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:33:24 crc kubenswrapper[4690]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:33:24 crc kubenswrapper[4690]: EOF Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmwk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.281076 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.289070 4690 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.289121 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.289139 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.289158 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.289172 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:24 crc kubenswrapper[4690]: W0320 17:33:24.289527 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18651e4_89e3_43fd_a780_bfa6df87591e.slice/crio-5285f70fbcba75161e139d5d852a2c649510bfccb5cf2c520afab127f5087986 WatchSource:0}: Error finding container 5285f70fbcba75161e139d5d852a2c649510bfccb5cf2c520afab127f5087986: Status 404 returned error can't find the container with id 5285f70fbcba75161e139d5d852a2c649510bfccb5cf2c520afab127f5087986 Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.291912 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4rfg5" event={"ID":"deaf1de2-4906-4e89-ae1b-83b6d35f97a6","Type":"ContainerStarted","Data":"436e58178161603373aea3c43474edc80dd2bb429ac61236abafd43e232700ef"} Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.292152 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v64dg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.294059 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 17:33:24 crc kubenswrapper[4690]: while [ true ]; Mar 20 17:33:24 crc kubenswrapper[4690]: do Mar 20 17:33:24 crc kubenswrapper[4690]: for f in $(ls /tmp/serviceca); do Mar 20 17:33:24 crc kubenswrapper[4690]: echo $f Mar 20 17:33:24 crc kubenswrapper[4690]: ca_file_path="/tmp/serviceca/${f}" Mar 20 17:33:24 crc kubenswrapper[4690]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 17:33:24 crc kubenswrapper[4690]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 17:33:24 crc kubenswrapper[4690]: if [ -e "${reg_dir_path}" ]; then Mar 20 17:33:24 crc kubenswrapper[4690]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 17:33:24 crc kubenswrapper[4690]: else Mar 20 17:33:24 crc kubenswrapper[4690]: mkdir $reg_dir_path Mar 20 17:33:24 crc kubenswrapper[4690]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: for d in $(ls /etc/docker/certs.d); do Mar 20 17:33:24 crc kubenswrapper[4690]: echo $d Mar 20 17:33:24 crc kubenswrapper[4690]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 17:33:24 crc kubenswrapper[4690]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 17:33:24 crc kubenswrapper[4690]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 17:33:24 crc kubenswrapper[4690]: rm -rf /etc/docker/certs.d/$d Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 60 & wait ${!} Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmghf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-4rfg5_openshift-image-registry(deaf1de2-4906-4e89-ae1b-83b6d35f97a6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.294538 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" event={"ID":"3f51dea1-fc10-4d4a-9065-2d0c020b36f9","Type":"ContainerStarted","Data":"76dfffd762ac097f167a65809302339071c3584197ed9a50cb50228e0e655f2c"} Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.295250 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-4rfg5" podUID="deaf1de2-4906-4e89-ae1b-83b6d35f97a6" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.295539 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v64dg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.296794 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.297542 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:24 crc kubenswrapper[4690]: set -euo pipefail Mar 20 17:33:24 crc kubenswrapper[4690]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 17:33:24 crc kubenswrapper[4690]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 17:33:24 crc kubenswrapper[4690]: # As the secret mount is optional we must wait for the files to be present. Mar 20 17:33:24 crc kubenswrapper[4690]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 17:33:24 crc kubenswrapper[4690]: TS=$(date +%s) Mar 20 17:33:24 crc kubenswrapper[4690]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 17:33:24 crc kubenswrapper[4690]: HAS_LOGGED_INFO=0 Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: log_missing_certs(){ Mar 20 17:33:24 crc kubenswrapper[4690]: CUR_TS=$(date +%s) Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. 
Mar 20 17:33:24 crc kubenswrapper[4690]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 17:33:24 crc kubenswrapper[4690]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 17:33:24 crc kubenswrapper[4690]: HAS_LOGGED_INFO=1 Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: } Mar 20 17:33:24 crc kubenswrapper[4690]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 20 17:33:24 crc kubenswrapper[4690]: log_missing_certs Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 5 Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/kube-rbac-proxy \ Mar 20 17:33:24 crc kubenswrapper[4690]: --logtostderr \ Mar 20 17:33:24 crc kubenswrapper[4690]: --secure-listen-address=:9108 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --upstream=http://127.0.0.1:29108/ \ Mar 20 17:33:24 crc kubenswrapper[4690]: --tls-private-key-file=${TLS_PK} \ Mar 20 17:33:24 crc kubenswrapper[4690]: --tls-cert-file=${TLS_CERT} Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzj2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-8nqtt_openshift-ovn-kubernetes(3f51dea1-fc10-4d4a-9065-2d0c020b36f9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.298196 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerStarted","Data":"e38ffb03e7b4bd33817e263035245dbe1f5049e176917b7f5a342a321d69be15"} Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.300635 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-79kbc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-tzvwm_openshift-multus(3fe7c1d1-7aa9-4c64-941e-7415a99367ea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.300706 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: source "/env/_master" Mar 20 17:33:24 crc kubenswrapper[4690]: set +o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v4_join_subnet_opt= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "" != "" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v6_join_subnet_opt= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "" != "" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v4_transit_switch_subnet_opt= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "" != "" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v6_transit_switch_subnet_opt= Mar 20 17:33:24 crc 
kubenswrapper[4690]: if [[ "" != "" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: dns_name_resolver_enabled_flag= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "false" == "true" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: persistent_ips_enabled_flag= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "true" == "true" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # This is needed so that converting clusters from GA to TP Mar 20 17:33:24 crc kubenswrapper[4690]: # will rollout control plane pods as well Mar 20 17:33:24 crc kubenswrapper[4690]: network_segmentation_enabled_flag= Mar 20 17:33:24 crc kubenswrapper[4690]: multi_network_enabled_flag= Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "true" == "true" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: multi_network_enabled_flag="--enable-multi-network" Mar 20 17:33:24 crc kubenswrapper[4690]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/ovnkube \ Mar 20 17:33:24 crc kubenswrapper[4690]: --enable-interconnect \ Mar 20 17:33:24 crc kubenswrapper[4690]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 17:33:24 crc kubenswrapper[4690]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --metrics-enable-pprof \ Mar 20 17:33:24 crc kubenswrapper[4690]: --metrics-enable-config-duration \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ovn_v4_join_subnet_opt} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ovn_v6_join_subnet_opt} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${dns_name_resolver_enabled_flag} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${persistent_ips_enabled_flag} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${multi_network_enabled_flag} \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${network_segmentation_enabled_flag} Mar 20 17:33:24 crc kubenswrapper[4690]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzj2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-8nqtt_openshift-ovn-kubernetes(3f51dea1-fc10-4d4a-9065-2d0c020b36f9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.300751 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qhmg6" event={"ID":"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3","Type":"ContainerStarted","Data":"86d9f448b29f6d363012e374cd70f5a6bc4a2b0752af19a9100eb22c6148d733"} Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.301796 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" podUID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.301829 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" podUID="3f51dea1-fc10-4d4a-9065-2d0c020b36f9" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.301870 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:24 crc 
kubenswrapper[4690]: set -uo pipefail Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 17:33:24 crc kubenswrapper[4690]: HOSTS_FILE="/etc/hosts" Mar 20 17:33:24 crc kubenswrapper[4690]: TEMP_FILE="/etc/hosts.tmp" Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # Make a temporary file with the old hosts file's attributes. Mar 20 17:33:24 crc kubenswrapper[4690]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 17:33:24 crc kubenswrapper[4690]: echo "Failed to preserve hosts file. Exiting." Mar 20 17:33:24 crc kubenswrapper[4690]: exit 1 Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: while true; do Mar 20 17:33:24 crc kubenswrapper[4690]: declare -A svc_ips Mar 20 17:33:24 crc kubenswrapper[4690]: for svc in "${services[@]}"; do Mar 20 17:33:24 crc kubenswrapper[4690]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 17:33:24 crc kubenswrapper[4690]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 17:33:24 crc kubenswrapper[4690]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 17:33:24 crc kubenswrapper[4690]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 17:33:24 crc kubenswrapper[4690]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:24 crc kubenswrapper[4690]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:24 crc kubenswrapper[4690]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:24 crc kubenswrapper[4690]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 17:33:24 crc kubenswrapper[4690]: for i in ${!cmds[*]} Mar 20 17:33:24 crc kubenswrapper[4690]: do Mar 20 17:33:24 crc kubenswrapper[4690]: ips=($(eval "${cmds[i]}")) Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: svc_ips["${svc}"]="${ips[@]}" Mar 20 17:33:24 crc kubenswrapper[4690]: break Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # Update /etc/hosts only if we get valid service IPs Mar 20 17:33:24 crc kubenswrapper[4690]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 17:33:24 crc kubenswrapper[4690]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 17:33:24 crc kubenswrapper[4690]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 17:33:24 crc kubenswrapper[4690]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:24 crc kubenswrapper[4690]: continue Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # Append resolver entries for services Mar 20 17:33:24 crc kubenswrapper[4690]: rc=0 Mar 20 17:33:24 crc kubenswrapper[4690]: for svc in "${!svc_ips[@]}"; do Mar 20 17:33:24 crc kubenswrapper[4690]: for ip in ${svc_ips[${svc}]}; do Mar 20 17:33:24 crc kubenswrapper[4690]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ $rc -ne 0 ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:24 crc kubenswrapper[4690]: continue Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 17:33:24 crc kubenswrapper[4690]: # Replace /etc/hosts with our modified version if needed Mar 20 17:33:24 crc kubenswrapper[4690]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 17:33:24 crc kubenswrapper[4690]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:24 crc kubenswrapper[4690]: unset svc_ips Mar 20 17:33:24 crc kubenswrapper[4690]: done Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7lb8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-qhmg6_openshift-dns(e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc 
kubenswrapper[4690]: I0320 17:33:24.302221 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2fb76e94c7ed6ae217d822de7d5cef0f220c1621373125c3008444e1fbdd5490"} Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.302935 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-qhmg6" podUID="e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.303689 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: source "/env/_master" Mar 20 17:33:24 crc kubenswrapper[4690]: set +o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 17:33:24 crc kubenswrapper[4690]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:33:24 crc kubenswrapper[4690]: ho_enable="--enable-hybrid-overlay" Mar 20 17:33:24 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:33:24 crc kubenswrapper[4690]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:33:24 crc kubenswrapper[4690]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --webhook-host=127.0.0.1 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --webhook-port=9743 \ Mar 20 17:33:24 crc kubenswrapper[4690]: ${ho_enable} \ Mar 20 17:33:24 crc kubenswrapper[4690]: --enable-interconnect \ Mar 20 17:33:24 crc kubenswrapper[4690]: --disable-approver \ Mar 20 17:33:24 crc kubenswrapper[4690]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --wait-for-kubernetes-api=200s \ Mar 20 17:33:24 crc kubenswrapper[4690]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --loglevel="${LOGLEVEL}" Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.304379 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bf8dm" event={"ID":"189715be-f690-4a1d-9bd3-fb0dcae7affe","Type":"ContainerStarted","Data":"accdbf74a09879e58f694782878a1045befc7d7e7fceca6df9bc219e07df74ea"} Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.305461 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"cca249cb4b6b5151a2967ed0c06b0f8a24549915a836d9597d1d837c4b055a6e"} Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.305581 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: source "/env/_master" Mar 20 17:33:24 crc kubenswrapper[4690]: set +o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: Mar 20 17:33:24 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:33:24 crc kubenswrapper[4690]: --disable-webhook \ Mar 20 17:33:24 crc kubenswrapper[4690]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:33:24 crc kubenswrapper[4690]: --loglevel="${LOGLEVEL}" Mar 20 17:33:24 crc kubenswrapper[4690]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.306542 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b93cfea60d3522dee79dce30c069739c0488ee788e80373ab7a47bc1713973d7"} Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.306541 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.306685 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 17:33:24 crc kubenswrapper[4690]: apiVersion: v1 Mar 20 17:33:24 crc kubenswrapper[4690]: clusters: Mar 20 17:33:24 crc kubenswrapper[4690]: - cluster: Mar 20 17:33:24 crc kubenswrapper[4690]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 17:33:24 crc kubenswrapper[4690]: server: https://api-int.crc.testing:6443 Mar 20 17:33:24 crc kubenswrapper[4690]: name: default-cluster Mar 20 17:33:24 crc kubenswrapper[4690]: contexts: Mar 20 17:33:24 crc kubenswrapper[4690]: - context: Mar 20 17:33:24 crc kubenswrapper[4690]: cluster: default-cluster Mar 20 17:33:24 crc kubenswrapper[4690]: namespace: default Mar 20 17:33:24 crc kubenswrapper[4690]: user: default-auth Mar 20 17:33:24 crc kubenswrapper[4690]: name: default-context Mar 20 17:33:24 crc kubenswrapper[4690]: current-context: default-context Mar 20 17:33:24 crc kubenswrapper[4690]: kind: Config Mar 20 17:33:24 crc kubenswrapper[4690]: preferences: {} Mar 20 17:33:24 crc kubenswrapper[4690]: users: Mar 20 17:33:24 crc kubenswrapper[4690]: - name: default-auth Mar 20 17:33:24 crc kubenswrapper[4690]: user: Mar 20 17:33:24 crc kubenswrapper[4690]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:33:24 crc kubenswrapper[4690]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:33:24 crc kubenswrapper[4690]: EOF Mar 20 17:33:24 crc kubenswrapper[4690]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmwk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.306841 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.306881 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 17:33:24 crc kubenswrapper[4690]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 17:33:24 crc kubenswrapper[4690]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z9vwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-bf8dm_openshift-multus(189715be-f690-4a1d-9bd3-fb0dcae7affe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.307486 4690 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.307965 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-bf8dm" podUID="189715be-f690-4a1d-9bd3-fb0dcae7affe" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.307995 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d526e7837746e2cf9d46c45799107b9032a06048f5a67138c0b80ffd02e6ab23"} Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.308010 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.308598 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.310359 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:24 crc kubenswrapper[4690]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:24 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:24 crc kubenswrapper[4690]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:33:24 crc kubenswrapper[4690]: source /etc/kubernetes/apiserver-url.env Mar 20 17:33:24 crc kubenswrapper[4690]: else Mar 20 17:33:24 crc kubenswrapper[4690]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:33:24 crc kubenswrapper[4690]: exit 1 Mar 20 17:33:24 crc kubenswrapper[4690]: fi Mar 20 17:33:24 crc kubenswrapper[4690]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:33:24 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRI
CS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:24 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.311834 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.318985 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.328150 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.339483 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.348504 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.360205 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.373353 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.379818 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.392387 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.392440 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.392459 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.392482 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.392499 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.394743 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.420553 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.439015 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.449531 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.462577 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.472759 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc 
kubenswrapper[4690]: I0320 17:33:24.484177 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.495896 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.495943 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.495961 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.495984 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.496002 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.496927 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.512671 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.523962 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.534185 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.534481 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.534536 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.534562 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:33:25.534529246 +0000 UTC m=+80.400354924 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.534672 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.534689 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.534759 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:25.534738852 +0000 UTC m=+80.400564560 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.534789 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:25.534775053 +0000 UTC m=+80.400600771 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.540742 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.567023 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.582232 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.596086 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.598790 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.598857 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.598883 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.598915 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.598938 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.610713 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.621790 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.626052 4690 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.634311 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.636309 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.636381 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.636417 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.636619 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.636622 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.636645 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.636664 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.636670 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.636683 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.636747 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:25.636726202 +0000 UTC m=+80.502551920 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.636780 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:25.636762773 +0000 UTC m=+80.502588491 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.637500 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: E0320 17:33:24.637593 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:25.637573416 +0000 UTC m=+80.503399124 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.647064 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.657994 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.670419 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.702416 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.702470 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.702488 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.702517 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.702534 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.805312 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.805379 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.805400 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.805424 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.805442 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.908153 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.908225 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.908245 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.908298 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:24 crc kubenswrapper[4690]: I0320 17:33:24.908319 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:24Z","lastTransitionTime":"2026-03-20T17:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.011697 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.011806 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.011821 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.011845 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.011861 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.115078 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.115126 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.115137 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.115153 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.115166 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.217610 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.217709 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.217736 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.217765 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.217783 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.312319 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"5285f70fbcba75161e139d5d852a2c649510bfccb5cf2c520afab127f5087986"} Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.314570 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v64dg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.321398 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.321454 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.321482 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.321512 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.321535 4690 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.321558 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v64dg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.323502 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.330097 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.344618 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.356310 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.367592 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not 
yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.384229 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.397872 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.413350 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.424795 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.424868 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.424891 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.424917 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.424942 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.425495 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.428577 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.445836 4690 scope.go:117] "RemoveContainer" 
containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.446241 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.446411 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.448248 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.479391 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.497420 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.512020 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.527112 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.528512 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.528533 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.528605 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.528624 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.528635 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.538981 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.545876 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.546003 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 
17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.546012 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:33:27.545992691 +0000 UTC m=+82.411818379 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.546123 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.546155 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.546164 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:27.546152336 +0000 UTC m=+82.411978024 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.546288 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.546408 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:27.546381862 +0000 UTC m=+82.412207590 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.631356 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.631399 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.631411 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.631427 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.631438 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.647300 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.647369 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.647410 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647426 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647490 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:27.647468626 +0000 UTC m=+82.513294314 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647546 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647570 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647585 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647629 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:27.64761439 +0000 UTC m=+82.513440078 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647707 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647723 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647737 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.647775 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:27.647763405 +0000 UTC m=+82.513589103 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.734748 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.734801 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.734820 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.734839 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.734854 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.838937 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.839002 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.839021 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.839046 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.839064 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.882432 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.882472 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.882507 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.882937 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.883070 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.883380 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.883429 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:25 crc kubenswrapper[4690]: E0320 17:33:25.883535 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.891576 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.892857 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.895183 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.896197 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.897828 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.900047 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.901174 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.902535 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.904629 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.906181 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.908643 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.910519 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.913056 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.913889 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.914859 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.916305 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 17:33:25 crc 
kubenswrapper[4690]: I0320 17:33:25.916698 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.917222 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.918823 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.919419 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.920380 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.922027 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.922729 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.923907 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.924980 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.927308 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.928215 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.930565 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.932198 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.933390 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.935931 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.937612 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.940302 4690 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.940661 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.942793 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.944639 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.944705 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.944718 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.944755 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.944768 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:25Z","lastTransitionTime":"2026-03-20T17:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.945338 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.947392 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.948599 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.951507 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.953037 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.954629 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.955582 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.957243 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.957946 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.960846 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.960865 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.964335 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.965828 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.967891 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.969290 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.971446 4690 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.973140 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.975500 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.975950 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.977866 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.978772 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.980682 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.981618 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.982674 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 17:33:25 crc kubenswrapper[4690]: I0320 17:33:25.986294 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.001662 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.018697 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.032174 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.047657 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.048003 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 
crc kubenswrapper[4690]: I0320 17:33:26.048102 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.048178 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.048307 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.048390 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.059655 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.070454 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.086869 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.099753 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.111728 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.151924 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.151969 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.151987 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.152010 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.152028 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.255159 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.255217 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.255241 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.255300 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.255334 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.317401 4690 scope.go:117] "RemoveContainer" containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" Mar 20 17:33:26 crc kubenswrapper[4690]: E0320 17:33:26.317884 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.358190 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.358248 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.358295 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.358321 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.358338 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.461110 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.461162 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.461182 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.461210 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.461232 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.564184 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.564300 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.564339 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.564371 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.564392 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.667795 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.667864 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.667882 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.667949 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.667973 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.771280 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.771334 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.771352 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.771376 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.771396 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.874992 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.875074 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.875099 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.875128 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.875150 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.978808 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.978876 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.978894 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.978923 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:26 crc kubenswrapper[4690]: I0320 17:33:26.978943 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:26Z","lastTransitionTime":"2026-03-20T17:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.050301 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.050389 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.050408 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.050850 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.050929 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.067517 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.072627 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.072691 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.072708 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.072732 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.072749 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.088791 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.093843 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.093886 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.093903 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.093927 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.093944 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.110033 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.116116 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.116178 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.116197 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.116225 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.116243 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.130845 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.135767 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.135809 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.135825 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.135849 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.135868 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.151416 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.151703 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.154157 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.154251 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.154300 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.154322 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.154340 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.256975 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.257106 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.257128 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.257151 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.257171 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.359461 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.359497 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.359507 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.359520 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.359531 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.463352 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.463430 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.463454 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.463485 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.463508 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.565413 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.565582 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.565629 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:33:31.565598287 +0000 UTC m=+86.431424005 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.565672 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.565692 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.565760 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:31.565743261 +0000 UTC m=+86.431568979 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.565860 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.565976 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:31.565946097 +0000 UTC m=+86.431771805 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.567107 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.567177 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.567201 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.567234 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.567290 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.666747 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.666822 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.666921 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667040 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667081 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667087 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667119 4690 
projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667154 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:31.667132443 +0000 UTC m=+86.532958151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667157 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667193 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:31.667168204 +0000 UTC m=+86.532993922 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667196 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667223 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.667326 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:31.667308128 +0000 UTC m=+86.533133846 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.669831 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.669870 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.669904 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.669929 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.669947 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.773641 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.773688 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.773704 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.773727 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.773744 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.877566 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.877631 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.877657 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.877687 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.877712 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.882715 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.882729 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.882972 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.883030 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.883191 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.883358 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.883372 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:27 crc kubenswrapper[4690]: E0320 17:33:27.883480 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.981428 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.981479 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.981496 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.981524 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:27 crc kubenswrapper[4690]: I0320 17:33:27.981540 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:27Z","lastTransitionTime":"2026-03-20T17:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.085600 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.086508 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.086712 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.086903 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.087109 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.191022 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.191084 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.191103 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.191135 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.191155 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.295480 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.295556 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.295577 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.295608 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.295631 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.398795 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.398888 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.398913 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.398958 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.398985 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.504161 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.504237 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.504289 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.504320 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.504341 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.608558 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.609194 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.609241 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.609314 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.609341 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.713507 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.713611 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.713655 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.713698 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.713770 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.817235 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.817332 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.817352 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.817387 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.817413 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.920601 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.920677 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.920700 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.920730 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:28 crc kubenswrapper[4690]: I0320 17:33:28.920752 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:28Z","lastTransitionTime":"2026-03-20T17:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.023742 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.023819 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.023840 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.023873 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.023902 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.127547 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.127585 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.127599 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.127616 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.127631 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.231224 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.231348 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.231374 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.231400 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.231422 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.334679 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.334794 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.334818 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.334860 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.334883 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.438290 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.438369 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.438570 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.438618 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.438648 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.541857 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.541931 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.541950 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.541989 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.542012 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.645484 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.645528 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.645540 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.645561 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.645574 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.749468 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.749526 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.749546 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.749577 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.749607 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.853370 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.853436 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.853452 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.853482 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.853501 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.883561 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.883708 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:29 crc kubenswrapper[4690]: E0320 17:33:29.883752 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.883822 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:29 crc kubenswrapper[4690]: E0320 17:33:29.883979 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:29 crc kubenswrapper[4690]: E0320 17:33:29.884243 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.884316 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:29 crc kubenswrapper[4690]: E0320 17:33:29.885042 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.957471 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.957533 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.957543 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.957564 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:29 crc kubenswrapper[4690]: I0320 17:33:29.957584 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:29Z","lastTransitionTime":"2026-03-20T17:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.061320 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.061408 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.061429 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.061460 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.061483 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.164833 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.164891 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.164910 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.164936 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.164955 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.268793 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.268863 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.268881 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.268910 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.268944 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.372432 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.372785 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.372979 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.373124 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.373299 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.475886 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.475941 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.475957 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.475980 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.475997 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.578754 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.578826 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.578852 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.578883 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.578908 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.682381 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.682441 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.682459 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.682482 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.682498 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.785619 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.785688 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.785713 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.785743 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.785766 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.889388 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.889448 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.889466 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.889645 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.889663 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.992874 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.992936 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.992956 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.992982 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:30 crc kubenswrapper[4690]: I0320 17:33:30.993000 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:30Z","lastTransitionTime":"2026-03-20T17:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.095749 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.095813 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.095835 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.095865 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.095886 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:31Z","lastTransitionTime":"2026-03-20T17:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.198910 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.198961 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.198977 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.198999 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.199016 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:31Z","lastTransitionTime":"2026-03-20T17:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.301678 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.301773 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.301790 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.301813 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.301832 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:31Z","lastTransitionTime":"2026-03-20T17:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.404389 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.404459 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.404482 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.404505 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.404525 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:31Z","lastTransitionTime":"2026-03-20T17:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.507857 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.507926 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.507943 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.507967 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.507985 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:31Z","lastTransitionTime":"2026-03-20T17:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.910826 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.910984 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.911034 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.911089 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913048 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913081 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913096 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913198 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913250 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913287 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:33:39.913181327 +0000 UTC m=+94.779007045 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913308 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913350 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:39.913322961 +0000 UTC m=+94.779148679 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913380 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913474 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:39.913447084 +0000 UTC m=+94.779272792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913524 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:39.913488065 +0000 UTC m=+94.779313783 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.913703 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.913711 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.913914 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.914093 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.914194 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.914231 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.914322 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:39.914296889 +0000 UTC m=+94.780122607 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.914328 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.915781 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.913893 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.916201 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.916395 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.916695 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: E0320 17:33:31.916849 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:39.916749339 +0000 UTC m=+94.782575047 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.920644 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.920674 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.920684 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.920705 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.920720 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:31Z","lastTransitionTime":"2026-03-20T17:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:31 crc kubenswrapper[4690]: I0320 17:33:31.932346 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.023768 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.023831 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.023848 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.023874 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.023956 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.127112 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.127198 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.127225 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.127290 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.127318 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.230208 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.230280 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.230298 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.230320 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.230336 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.334669 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.334715 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.334731 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.334753 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.334787 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.437792 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.437867 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.437884 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.437911 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.437929 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.541469 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.541556 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.541587 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.541619 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.541643 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.643844 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.644129 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.644229 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.644363 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.644486 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.747695 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.747744 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.747762 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.747785 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.747803 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.851294 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.851358 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.851379 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.851404 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.851421 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.954383 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.954447 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.954466 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.954490 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:32 crc kubenswrapper[4690]: I0320 17:33:32.954509 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:32Z","lastTransitionTime":"2026-03-20T17:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.057435 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.057540 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.057562 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.057584 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.057610 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.160396 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.160460 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.160477 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.160501 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.160521 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.263027 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.263084 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.263103 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.263125 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.263143 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.366418 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.366487 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.366512 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.366542 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.366569 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.469589 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.469646 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.469664 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.469692 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.469710 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.572038 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.572104 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.572125 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.572158 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.572180 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.675167 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.675356 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.675386 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.675418 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.675442 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.778549 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.778606 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.778622 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.778645 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.778662 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.882335 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.882397 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.882419 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.882448 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.882470 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.882495 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:33 crc kubenswrapper[4690]: E0320 17:33:33.882751 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.882828 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.883400 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.883466 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:33 crc kubenswrapper[4690]: E0320 17:33:33.883710 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:33 crc kubenswrapper[4690]: E0320 17:33:33.883863 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:33 crc kubenswrapper[4690]: E0320 17:33:33.883976 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.905148 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.990605 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.991007 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.991141 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.991504 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:33 crc kubenswrapper[4690]: I0320 17:33:33.991681 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:33Z","lastTransitionTime":"2026-03-20T17:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.095096 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.095184 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.095206 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.095240 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.095287 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.197865 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.198244 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.198458 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.198615 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.198762 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.302065 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.302150 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.302163 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.302189 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.302203 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.405182 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.405250 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.405284 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.405300 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.405312 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.508749 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.508799 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.508817 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.508840 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.508857 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.611304 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.611735 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.611900 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.612053 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.612210 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.715130 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.715172 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.715182 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.715198 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.715209 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.817457 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.817490 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.817517 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.817529 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.817537 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.920087 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.920134 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.920145 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.920166 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:34 crc kubenswrapper[4690]: I0320 17:33:34.920178 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:34Z","lastTransitionTime":"2026-03-20T17:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.023485 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.023559 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.023577 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.023604 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.023626 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.127080 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.127155 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.127177 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.127205 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.127224 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.230622 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.230712 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.230731 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.230752 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.230770 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.333293 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.333342 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.333406 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.333435 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.333459 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.437772 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.437818 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.437830 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.437850 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.437863 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.540299 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.540370 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.540383 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.540408 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.540424 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.643194 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.643290 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.643311 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.643337 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.643360 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.746526 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.746592 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.746610 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.746637 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.746656 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.849244 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.849541 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.849662 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.849784 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.849891 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.883001 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.883147 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:35 crc kubenswrapper[4690]: E0320 17:33:35.883316 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.883189 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:35 crc kubenswrapper[4690]: E0320 17:33:35.883141 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.883890 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:35 crc kubenswrapper[4690]: E0320 17:33:35.884052 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:35 crc kubenswrapper[4690]: E0320 17:33:35.884163 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:35 crc kubenswrapper[4690]: E0320 17:33:35.886496 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:35 crc kubenswrapper[4690]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:35 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:35 crc kubenswrapper[4690]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:33:35 crc kubenswrapper[4690]: source /etc/kubernetes/apiserver-url.env Mar 20 17:33:35 crc kubenswrapper[4690]: else Mar 20 17:33:35 crc kubenswrapper[4690]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:33:35 crc kubenswrapper[4690]: exit 1 Mar 20 17:33:35 crc kubenswrapper[4690]: fi Mar 20 17:33:35 crc kubenswrapper[4690]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:33:35 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_D
AEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:35 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:35 crc kubenswrapper[4690]: E0320 17:33:35.886938 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:35 crc kubenswrapper[4690]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 17:33:35 crc kubenswrapper[4690]: apiVersion: v1 Mar 20 17:33:35 crc kubenswrapper[4690]: clusters: Mar 20 17:33:35 crc kubenswrapper[4690]: - cluster: Mar 20 17:33:35 crc kubenswrapper[4690]: certificate-authority: 
/var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 17:33:35 crc kubenswrapper[4690]: server: https://api-int.crc.testing:6443 Mar 20 17:33:35 crc kubenswrapper[4690]: name: default-cluster Mar 20 17:33:35 crc kubenswrapper[4690]: contexts: Mar 20 17:33:35 crc kubenswrapper[4690]: - context: Mar 20 17:33:35 crc kubenswrapper[4690]: cluster: default-cluster Mar 20 17:33:35 crc kubenswrapper[4690]: namespace: default Mar 20 17:33:35 crc kubenswrapper[4690]: user: default-auth Mar 20 17:33:35 crc kubenswrapper[4690]: name: default-context Mar 20 17:33:35 crc kubenswrapper[4690]: current-context: default-context Mar 20 17:33:35 crc kubenswrapper[4690]: kind: Config Mar 20 17:33:35 crc kubenswrapper[4690]: preferences: {} Mar 20 17:33:35 crc kubenswrapper[4690]: users: Mar 20 17:33:35 crc kubenswrapper[4690]: - name: default-auth Mar 20 17:33:35 crc kubenswrapper[4690]: user: Mar 20 17:33:35 crc kubenswrapper[4690]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:33:35 crc kubenswrapper[4690]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:33:35 crc kubenswrapper[4690]: EOF Mar 20 17:33:35 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmwk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:35 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:35 crc kubenswrapper[4690]: E0320 17:33:35.887750 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:33:35 crc kubenswrapper[4690]: E0320 17:33:35.888725 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.898603 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.911494 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.926863 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.940618 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not 
yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.954067 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.954712 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.954758 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.954775 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.954799 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.954815 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:35Z","lastTransitionTime":"2026-03-20T17:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.964370 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.967610 4690 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 17:33:35 crc kubenswrapper[4690]: I0320 17:33:35.981526 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.007748 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.020829 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.039143 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.057576 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.057704 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.057737 4690 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.057761 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.057793 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.066087 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.076980 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.092792 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.109717 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.123792 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.137324 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.146981 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.161175 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.161229 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.161245 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.161300 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.161318 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.264456 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.264539 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.264559 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.264582 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.264599 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.367103 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.367144 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.367198 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.367217 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.367231 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.470156 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.470222 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.470244 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.470305 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.470328 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.573306 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.573410 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.573434 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.573458 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.573475 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.676436 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.676482 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.676493 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.676509 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.676521 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.779425 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.779458 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.779470 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.779485 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.779496 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.882162 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.882218 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.882235 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.882289 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.882308 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:36 crc kubenswrapper[4690]: E0320 17:33:36.884290 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:36 crc kubenswrapper[4690]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:36 crc kubenswrapper[4690]: set -euo pipefail Mar 20 17:33:36 crc kubenswrapper[4690]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 17:33:36 crc kubenswrapper[4690]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 17:33:36 crc kubenswrapper[4690]: # As the secret mount is optional we must wait for the files to be present. Mar 20 17:33:36 crc kubenswrapper[4690]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 17:33:36 crc kubenswrapper[4690]: TS=$(date +%s) Mar 20 17:33:36 crc kubenswrapper[4690]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 17:33:36 crc kubenswrapper[4690]: HAS_LOGGED_INFO=0 Mar 20 17:33:36 crc kubenswrapper[4690]: Mar 20 17:33:36 crc kubenswrapper[4690]: log_missing_certs(){ Mar 20 17:33:36 crc kubenswrapper[4690]: CUR_TS=$(date +%s) Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 17:33:36 crc kubenswrapper[4690]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 17:33:36 crc kubenswrapper[4690]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 17:33:36 crc kubenswrapper[4690]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 17:33:36 crc kubenswrapper[4690]: HAS_LOGGED_INFO=1 Mar 20 17:33:36 crc kubenswrapper[4690]: fi Mar 20 17:33:36 crc kubenswrapper[4690]: } Mar 20 17:33:36 crc kubenswrapper[4690]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 17:33:36 crc kubenswrapper[4690]: log_missing_certs Mar 20 17:33:36 crc kubenswrapper[4690]: sleep 5 Mar 20 17:33:36 crc kubenswrapper[4690]: done Mar 20 17:33:36 crc kubenswrapper[4690]: Mar 20 17:33:36 crc kubenswrapper[4690]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 17:33:36 crc kubenswrapper[4690]: exec /usr/bin/kube-rbac-proxy \ Mar 20 17:33:36 crc kubenswrapper[4690]: --logtostderr \ Mar 20 17:33:36 crc kubenswrapper[4690]: --secure-listen-address=:9108 \ Mar 20 17:33:36 crc kubenswrapper[4690]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 17:33:36 crc kubenswrapper[4690]: --upstream=http://127.0.0.1:29108/ \ Mar 20 17:33:36 crc kubenswrapper[4690]: --tls-private-key-file=${TLS_PK} \ Mar 20 17:33:36 crc kubenswrapper[4690]: --tls-cert-file=${TLS_CERT} Mar 20 17:33:36 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzj2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-8nqtt_openshift-ovn-kubernetes(3f51dea1-fc10-4d4a-9065-2d0c020b36f9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:36 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:36 crc kubenswrapper[4690]: E0320 17:33:36.884389 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:36 crc kubenswrapper[4690]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then Mar 20 17:33:36 crc kubenswrapper[4690]: set -o allexport Mar 20 17:33:36 crc kubenswrapper[4690]: source "/env/_master" Mar 20 17:33:36 crc kubenswrapper[4690]: set +o allexport Mar 20 17:33:36 crc kubenswrapper[4690]: fi Mar 20 17:33:36 crc kubenswrapper[4690]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 17:33:36 crc kubenswrapper[4690]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Mar 20 17:33:36 crc kubenswrapper[4690]: ho_enable="--enable-hybrid-overlay"
Mar 20 17:33:36 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Mar 20 17:33:36 crc kubenswrapper[4690]: # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Mar 20 17:33:36 crc kubenswrapper[4690]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Mar 20 17:33:36 crc kubenswrapper[4690]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 17:33:36 crc kubenswrapper[4690]: --webhook-cert-dir="/etc/webhook-cert" \
Mar 20 17:33:36 crc kubenswrapper[4690]: --webhook-host=127.0.0.1 \
Mar 20 17:33:36 crc kubenswrapper[4690]: --webhook-port=9743 \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${ho_enable} \
Mar 20 17:33:36 crc kubenswrapper[4690]: --enable-interconnect \
Mar 20 17:33:36 crc kubenswrapper[4690]: --disable-approver \
Mar 20 17:33:36 crc kubenswrapper[4690]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Mar 20 17:33:36 crc kubenswrapper[4690]: --wait-for-kubernetes-api=200s \
Mar 20 17:33:36 crc kubenswrapper[4690]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Mar 20 17:33:36 crc kubenswrapper[4690]: --loglevel="${LOGLEVEL}"
Mar 20 17:33:36 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 17:33:36 crc kubenswrapper[4690]: > logger="UnhandledError"
Mar 20 17:33:36 crc kubenswrapper[4690]: E0320 17:33:36.887790 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 17:33:36 crc kubenswrapper[4690]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: set -o allexport
Mar 20 17:33:36 crc kubenswrapper[4690]: source "/env/_master"
Mar 20 17:33:36 crc kubenswrapper[4690]: set +o allexport
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]:
Mar 20 17:33:36 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 20 17:33:36 crc kubenswrapper[4690]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 17:33:36 crc kubenswrapper[4690]: --disable-webhook \
Mar 20 17:33:36 crc kubenswrapper[4690]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 20 17:33:36 crc kubenswrapper[4690]: --loglevel="${LOGLEVEL}"
Mar 20 17:33:36 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 17:33:36 crc kubenswrapper[4690]: > logger="UnhandledError"
Mar 20 17:33:36 crc kubenswrapper[4690]: E0320 17:33:36.889092 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 20 17:33:36 crc kubenswrapper[4690]: E0320 17:33:36.892474 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 17:33:36 crc kubenswrapper[4690]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ -f "/env/_master" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: set -o allexport
Mar 20 17:33:36 crc kubenswrapper[4690]: source "/env/_master"
Mar 20 17:33:36 crc kubenswrapper[4690]: set +o allexport
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]:
Mar 20 17:33:36 crc kubenswrapper[4690]: ovn_v4_join_subnet_opt=
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ "" != "" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet "
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]: ovn_v6_join_subnet_opt=
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ "" != "" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet "
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]:
Mar 20 17:33:36 crc kubenswrapper[4690]: ovn_v4_transit_switch_subnet_opt=
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ "" != "" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet "
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]: ovn_v6_transit_switch_subnet_opt=
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ "" != "" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet "
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]:
Mar 20 17:33:36 crc kubenswrapper[4690]: dns_name_resolver_enabled_flag=
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ "false" == "true" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver"
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]:
Mar 20 17:33:36 crc kubenswrapper[4690]: persistent_ips_enabled_flag=
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ "true" == "true" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: persistent_ips_enabled_flag="--enable-persistent-ips"
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]:
Mar 20 17:33:36 crc kubenswrapper[4690]: # This is needed so that converting clusters from GA to TP
Mar 20 17:33:36 crc kubenswrapper[4690]: # will rollout control plane pods as well
Mar 20 17:33:36 crc kubenswrapper[4690]: network_segmentation_enabled_flag=
Mar 20 17:33:36 crc kubenswrapper[4690]: multi_network_enabled_flag=
Mar 20 17:33:36 crc kubenswrapper[4690]: if [[ "true" == "true" ]]; then
Mar 20 17:33:36 crc kubenswrapper[4690]: multi_network_enabled_flag="--enable-multi-network"
Mar 20 17:33:36 crc kubenswrapper[4690]: network_segmentation_enabled_flag="--enable-network-segmentation"
Mar 20 17:33:36 crc kubenswrapper[4690]: fi
Mar 20 17:33:36 crc kubenswrapper[4690]:
Mar 20 17:33:36 crc kubenswrapper[4690]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}"
Mar 20 17:33:36 crc kubenswrapper[4690]: exec /usr/bin/ovnkube \
Mar 20 17:33:36 crc kubenswrapper[4690]: --enable-interconnect \
Mar 20 17:33:36 crc kubenswrapper[4690]: --init-cluster-manager "${K8S_NODE}" \
Mar 20 17:33:36 crc kubenswrapper[4690]: --config-file=/run/ovnkube-config/ovnkube.conf \
Mar 20 17:33:36 crc kubenswrapper[4690]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \
Mar 20 17:33:36 crc kubenswrapper[4690]: --metrics-bind-address "127.0.0.1:29108" \
Mar 20 17:33:36 crc kubenswrapper[4690]: --metrics-enable-pprof \
Mar 20 17:33:36 crc kubenswrapper[4690]: --metrics-enable-config-duration \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${ovn_v4_join_subnet_opt} \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${ovn_v6_join_subnet_opt} \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${ovn_v4_transit_switch_subnet_opt} \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${ovn_v6_transit_switch_subnet_opt} \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${dns_name_resolver_enabled_flag} \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${persistent_ips_enabled_flag} \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${multi_network_enabled_flag} \
Mar 20 17:33:36 crc kubenswrapper[4690]: ${network_segmentation_enabled_flag}
Mar 20 17:33:36 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzj2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-8nqtt_openshift-ovn-kubernetes(3f51dea1-fc10-4d4a-9065-2d0c020b36f9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 17:33:36 crc kubenswrapper[4690]: > logger="UnhandledError"
Mar 20 17:33:36 crc kubenswrapper[4690]: E0320 17:33:36.893734 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" podUID="3f51dea1-fc10-4d4a-9065-2d0c020b36f9"
Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.985429 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.985505 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.985524 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.985548 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:33:36 crc kubenswrapper[4690]: I0320 17:33:36.985566 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:36Z","lastTransitionTime":"2026-03-20T17:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.088740 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.088790 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.088804 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.088823 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.088837 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.176948 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.177023 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.177046 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.177074 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.177096 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.190569 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.195037 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.195085 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.195101 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.195124 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.195140 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.243905 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.267689 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.267734 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.267746 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.267764 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.267777 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.279443 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.283061 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.283104 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.283119 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.283140 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.283154 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.298178 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.301791 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.301834 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.301851 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.301870 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.301884 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.315396 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.315610 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.317167 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.317275 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.317297 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.317318 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.317332 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.420091 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.420134 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.420150 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.420174 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.420218 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.523003 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.523071 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.523091 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.523122 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.523174 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.626079 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.626140 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.626157 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.626182 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.626199 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.728986 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.729065 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.729094 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.729126 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.729151 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.832663 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.832717 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.832737 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.832764 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.832781 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.882434 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.882722 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.882753 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.882835 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.882836 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.883001 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.883142 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.883735 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.886473 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-79kbc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-tzvwm_openshift-multus(3fe7c1d1-7aa9-4c64-941e-7415a99367ea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.886906 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:37 crc kubenswrapper[4690]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 17:33:37 crc kubenswrapper[4690]: set -uo pipefail Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 17:33:37 crc kubenswrapper[4690]: HOSTS_FILE="/etc/hosts" Mar 20 17:33:37 crc kubenswrapper[4690]: TEMP_FILE="/etc/hosts.tmp" Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: # Make a temporary file with the old hosts file's attributes. Mar 20 17:33:37 crc kubenswrapper[4690]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 17:33:37 crc kubenswrapper[4690]: echo "Failed to preserve hosts file. Exiting." 
Mar 20 17:33:37 crc kubenswrapper[4690]: exit 1 Mar 20 17:33:37 crc kubenswrapper[4690]: fi Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: while true; do Mar 20 17:33:37 crc kubenswrapper[4690]: declare -A svc_ips Mar 20 17:33:37 crc kubenswrapper[4690]: for svc in "${services[@]}"; do Mar 20 17:33:37 crc kubenswrapper[4690]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 17:33:37 crc kubenswrapper[4690]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 17:33:37 crc kubenswrapper[4690]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 17:33:37 crc kubenswrapper[4690]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 17:33:37 crc kubenswrapper[4690]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:37 crc kubenswrapper[4690]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:37 crc kubenswrapper[4690]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:33:37 crc kubenswrapper[4690]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 17:33:37 crc kubenswrapper[4690]: for i in ${!cmds[*]} Mar 20 17:33:37 crc kubenswrapper[4690]: do Mar 20 17:33:37 crc kubenswrapper[4690]: ips=($(eval "${cmds[i]}")) Mar 20 17:33:37 crc kubenswrapper[4690]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 17:33:37 crc kubenswrapper[4690]: svc_ips["${svc}"]="${ips[@]}" Mar 20 17:33:37 crc kubenswrapper[4690]: break Mar 20 17:33:37 crc kubenswrapper[4690]: fi Mar 20 17:33:37 crc kubenswrapper[4690]: done Mar 20 17:33:37 crc kubenswrapper[4690]: done Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: # Update /etc/hosts only if we get valid service IPs Mar 20 17:33:37 crc kubenswrapper[4690]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 17:33:37 crc kubenswrapper[4690]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 17:33:37 crc kubenswrapper[4690]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 17:33:37 crc kubenswrapper[4690]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 17:33:37 crc kubenswrapper[4690]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 17:33:37 crc kubenswrapper[4690]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 17:33:37 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:37 crc kubenswrapper[4690]: continue Mar 20 17:33:37 crc kubenswrapper[4690]: fi Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: # Append resolver entries for services Mar 20 17:33:37 crc kubenswrapper[4690]: rc=0 Mar 20 17:33:37 crc kubenswrapper[4690]: for svc in "${!svc_ips[@]}"; do Mar 20 17:33:37 crc kubenswrapper[4690]: for ip in ${svc_ips[${svc}]}; do Mar 20 17:33:37 crc kubenswrapper[4690]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 17:33:37 crc kubenswrapper[4690]: done Mar 20 17:33:37 crc kubenswrapper[4690]: done Mar 20 17:33:37 crc kubenswrapper[4690]: if [[ $rc -ne 0 ]]; then Mar 20 17:33:37 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:37 crc kubenswrapper[4690]: continue Mar 20 17:33:37 crc kubenswrapper[4690]: fi Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: Mar 20 17:33:37 crc kubenswrapper[4690]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 17:33:37 crc kubenswrapper[4690]: # Replace /etc/hosts with our modified version if needed Mar 20 17:33:37 crc kubenswrapper[4690]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 17:33:37 crc kubenswrapper[4690]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 17:33:37 crc kubenswrapper[4690]: fi Mar 20 17:33:37 crc kubenswrapper[4690]: sleep 60 & wait Mar 20 17:33:37 crc kubenswrapper[4690]: unset svc_ips Mar 20 17:33:37 crc kubenswrapper[4690]: done Mar 20 17:33:37 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7lb8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-qhmg6_openshift-dns(e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:37 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.887095 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.888130 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" podUID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.888326 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-qhmg6" podUID="e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3" Mar 20 17:33:37 crc kubenswrapper[4690]: E0320 17:33:37.888561 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.935725 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.935774 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.935788 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.935807 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:37 crc kubenswrapper[4690]: I0320 17:33:37.935820 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:37Z","lastTransitionTime":"2026-03-20T17:33:37Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.038181 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.038296 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.038332 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.038360 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.038384 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.140707 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.140870 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.140938 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.140965 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.140984 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.244013 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.244059 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.244069 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.244084 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.244095 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.345530 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.345594 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.345616 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.345638 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.345657 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.447511 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.447552 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.447563 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.447579 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.447590 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.549957 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.550029 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.550052 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.550080 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.550104 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.653194 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.653289 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.653307 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.653331 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.653350 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.756466 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.756540 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.756570 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.756599 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.756620 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.858796 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.858842 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.858852 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.858870 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.858881 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.882742 4690 scope.go:117] "RemoveContainer" containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" Mar 20 17:33:38 crc kubenswrapper[4690]: E0320 17:33:38.882925 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:33:38 crc kubenswrapper[4690]: E0320 17:33:38.883834 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:38 crc kubenswrapper[4690]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 17:33:38 crc kubenswrapper[4690]: while [ true ]; Mar 20 17:33:38 crc kubenswrapper[4690]: do Mar 20 17:33:38 crc kubenswrapper[4690]: for f in $(ls /tmp/serviceca); do Mar 20 17:33:38 crc kubenswrapper[4690]: echo $f Mar 20 17:33:38 crc kubenswrapper[4690]: ca_file_path="/tmp/serviceca/${f}" Mar 20 17:33:38 crc kubenswrapper[4690]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 17:33:38 crc kubenswrapper[4690]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 17:33:38 crc kubenswrapper[4690]: if [ -e "${reg_dir_path}" ]; then Mar 20 17:33:38 crc kubenswrapper[4690]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 17:33:38 crc kubenswrapper[4690]: else Mar 20 17:33:38 crc kubenswrapper[4690]: mkdir $reg_dir_path Mar 20 17:33:38 crc kubenswrapper[4690]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 17:33:38 crc kubenswrapper[4690]: fi Mar 20 17:33:38 crc kubenswrapper[4690]: done Mar 20 17:33:38 crc kubenswrapper[4690]: for d in $(ls /etc/docker/certs.d); do Mar 20 17:33:38 crc kubenswrapper[4690]: echo $d Mar 20 17:33:38 crc kubenswrapper[4690]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 17:33:38 crc kubenswrapper[4690]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 17:33:38 crc kubenswrapper[4690]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 17:33:38 crc kubenswrapper[4690]: rm -rf /etc/docker/certs.d/$d Mar 20 17:33:38 crc kubenswrapper[4690]: fi Mar 20 17:33:38 crc kubenswrapper[4690]: done Mar 20 17:33:38 crc kubenswrapper[4690]: sleep 60 & wait ${!} Mar 20 17:33:38 crc kubenswrapper[4690]: done Mar 20 17:33:38 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmghf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-4rfg5_openshift-image-registry(deaf1de2-4906-4e89-ae1b-83b6d35f97a6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:38 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:38 crc kubenswrapper[4690]: E0320 17:33:38.884674 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v64dg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:38 crc kubenswrapper[4690]: E0320 17:33:38.885594 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-4rfg5" podUID="deaf1de2-4906-4e89-ae1b-83b6d35f97a6" Mar 20 17:33:38 crc kubenswrapper[4690]: E0320 17:33:38.888498 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v64dg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:33:38 crc kubenswrapper[4690]: E0320 17:33:38.889795 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.962034 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.962092 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.962108 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.962131 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:38 crc kubenswrapper[4690]: I0320 17:33:38.962148 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:38Z","lastTransitionTime":"2026-03-20T17:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.065953 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.066067 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.066088 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.066126 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.066143 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.168636 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.168699 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.168716 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.168740 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.168759 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.274721 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.274774 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.274789 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.274815 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.274830 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.378464 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.378531 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.378607 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.378697 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.378729 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.481611 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.481651 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.481663 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.481679 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.481692 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.584103 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.584135 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.584147 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.584163 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.584174 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.687162 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.687198 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.687206 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.687220 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.687230 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.790465 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.790506 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.790517 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.790533 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.790546 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.882676 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.882751 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.882703 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.883152 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:39 crc kubenswrapper[4690]: E0320 17:33:39.883227 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:39 crc kubenswrapper[4690]: E0320 17:33:39.883471 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:39 crc kubenswrapper[4690]: E0320 17:33:39.883551 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:39 crc kubenswrapper[4690]: E0320 17:33:39.883395 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:39 crc kubenswrapper[4690]: E0320 17:33:39.885861 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:39 crc kubenswrapper[4690]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 17:33:39 crc kubenswrapper[4690]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 17:33:39 crc kubenswrapper[4690]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z9vwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-bf8dm_openshift-multus(189715be-f690-4a1d-9bd3-fb0dcae7affe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:33:39 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:33:39 crc kubenswrapper[4690]: E0320 17:33:39.887129 4690 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-bf8dm" podUID="189715be-f690-4a1d-9bd3-fb0dcae7affe" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.893649 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.893698 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.893886 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.893956 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.893980 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.997023 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.997099 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.997133 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.997155 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:39 crc kubenswrapper[4690]: I0320 17:33:39.997170 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:39Z","lastTransitionTime":"2026-03-20T17:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.009117 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.009250 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.009304 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:33:56.009283314 +0000 UTC m=+110.875109152 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.009368 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.009420 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:56.009408018 +0000 UTC m=+110.875233896 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.009711 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.009848 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.009872 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.009885 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.009927 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:56.009915702 +0000 UTC m=+110.875741380 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.010327 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.010394 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.010437 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.010523 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.010559 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:56.010548701 +0000 UTC m=+110.876374379 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.010622 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.010636 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.010647 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.010675 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:56.010666464 +0000 UTC m=+110.876492142 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.010715 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: E0320 17:33:40.010745 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:56.010736546 +0000 UTC m=+110.876562224 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.099785 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.099831 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.099845 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.099862 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.099901 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.203042 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.203074 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.203082 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.203096 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.203106 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.305543 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.305738 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.305783 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.305816 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.305838 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.408332 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.408376 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.408388 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.408404 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.408416 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.511724 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.511788 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.511807 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.511830 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.511850 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.614418 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.614496 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.614531 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.614559 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.614580 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.717233 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.717346 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.717364 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.717389 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.717411 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.820144 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.820228 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.820244 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.820296 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.820321 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.923794 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.923852 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.924059 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.924081 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:40 crc kubenswrapper[4690]: I0320 17:33:40.924098 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:40Z","lastTransitionTime":"2026-03-20T17:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.027003 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.027051 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.027060 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.027078 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.027090 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.130114 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.130205 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.130227 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.130300 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.130340 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.233710 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.233868 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.233892 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.233923 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.233944 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.337121 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.337196 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.337219 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.337293 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.337319 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.440382 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.440453 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.440470 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.440496 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.440514 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.542969 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.543036 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.543054 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.543108 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.543125 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.646378 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.646451 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.646470 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.646494 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.646513 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.749472 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.749548 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.749573 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.749602 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.749625 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.853043 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.853101 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.853120 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.853146 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.853165 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.883053 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.883143 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.883142 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:41 crc kubenswrapper[4690]: E0320 17:33:41.883317 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.883359 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:41 crc kubenswrapper[4690]: E0320 17:33:41.883580 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:41 crc kubenswrapper[4690]: E0320 17:33:41.883804 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:41 crc kubenswrapper[4690]: E0320 17:33:41.883910 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.956001 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.956072 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.956091 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.956116 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:41 crc kubenswrapper[4690]: I0320 17:33:41.956135 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:41Z","lastTransitionTime":"2026-03-20T17:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.058658 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.058731 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.058805 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.058834 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.058852 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.162376 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.162417 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.162426 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.162444 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.162453 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.265760 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.265810 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.265824 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.265844 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.265859 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.368190 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.368229 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.368238 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.368251 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.368281 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.470946 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.470991 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.471002 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.471051 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.471063 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.574085 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.574137 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.574148 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.574165 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.574177 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.677307 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.677363 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.677381 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.677405 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.677421 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.781638 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.781699 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.781718 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.781755 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.781773 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.885537 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.885637 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.885789 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.885835 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.885857 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.989993 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.990090 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.990109 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.990137 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:42 crc kubenswrapper[4690]: I0320 17:33:42.990156 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:42Z","lastTransitionTime":"2026-03-20T17:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.094450 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.094615 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.094636 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.094664 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.094683 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.197936 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.198000 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.198022 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.198052 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.198071 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.301342 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.301430 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.301450 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.301479 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.301499 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.404175 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.404213 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.404224 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.404239 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.404249 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.506724 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.506796 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.506820 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.506853 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.506874 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.547020 4690 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.609539 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.609591 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.609603 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.609622 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.609634 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.712399 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.712442 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.712454 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.712472 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.712485 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.853171 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.853230 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.853291 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.853323 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.853344 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.882888 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.882953 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.882994 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:43 crc kubenswrapper[4690]: E0320 17:33:43.883125 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.883144 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:43 crc kubenswrapper[4690]: E0320 17:33:43.883243 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:43 crc kubenswrapper[4690]: E0320 17:33:43.883425 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:43 crc kubenswrapper[4690]: E0320 17:33:43.883685 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.956766 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.956822 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.956838 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.956863 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:43 crc kubenswrapper[4690]: I0320 17:33:43.956879 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:43Z","lastTransitionTime":"2026-03-20T17:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.059762 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.059849 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.059872 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.059898 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.059914 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.162609 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.162651 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.162663 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.162680 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.162693 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.265811 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.265858 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.265875 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.265892 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.265902 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.376201 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.376291 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.376312 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.376336 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.376354 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.479760 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.479882 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.479903 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.479925 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.479943 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.583239 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.583318 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.583338 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.583359 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.583376 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.687235 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.687350 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.687432 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.687461 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.687480 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.790525 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.790580 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.790596 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.790618 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.790635 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.893576 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.893624 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.893641 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.893662 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.893680 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.996660 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.996716 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.996733 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.996754 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:44 crc kubenswrapper[4690]: I0320 17:33:44.996770 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:44Z","lastTransitionTime":"2026-03-20T17:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.100187 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.100233 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.100250 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.100321 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.100339 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.203606 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.203664 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.203681 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.203706 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.203724 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.307233 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.307352 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.307378 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.307410 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.307432 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.410697 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.410761 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.410779 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.410802 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.410818 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.514105 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.514161 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.514219 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.514243 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.514290 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.617472 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.617541 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.617561 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.617587 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.617617 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.720029 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.720089 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.720176 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.720249 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.720308 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.822947 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.823002 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.823018 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.823040 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.823057 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.882540 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.882537 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.882677 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:45 crc kubenswrapper[4690]: E0320 17:33:45.882869 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.883391 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:45 crc kubenswrapper[4690]: E0320 17:33:45.883550 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:45 crc kubenswrapper[4690]: E0320 17:33:45.883657 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:45 crc kubenswrapper[4690]: E0320 17:33:45.883766 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.896385 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.913897 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.925849 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.925913 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.925931 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.925964 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.925983 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:45Z","lastTransitionTime":"2026-03-20T17:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.929809 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.944729 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.958630 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.970381 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.985581 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:45 crc kubenswrapper[4690]: I0320 17:33:45.998979 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.011470 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.027726 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.029642 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.029711 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.029738 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.029766 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.029787 4690 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.044862 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.060936 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.076067 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.107828 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.121005 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.133470 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.133532 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.133552 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.133577 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.133596 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.140829 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.168226 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.239335 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.239400 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: 
I0320 17:33:46.239417 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.239440 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.239457 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.343371 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.343424 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.343437 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.343461 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.343478 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.447733 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.447786 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.447822 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.447840 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.447855 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.551911 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.551951 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.551962 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.551980 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.551992 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.654621 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.654696 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.654715 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.654738 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.654755 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.756698 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.756760 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.756776 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.756800 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.756818 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.860201 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.860377 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.860396 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.860419 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.860436 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.963916 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.963981 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.963999 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.964026 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:46 crc kubenswrapper[4690]: I0320 17:33:46.964043 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:46Z","lastTransitionTime":"2026-03-20T17:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.066884 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.066952 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.066970 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.066997 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.067015 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.169773 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.169843 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.169862 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.169887 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.169911 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.273064 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.273195 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.273219 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.273285 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.273316 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.376613 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.376693 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.376718 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.376745 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.376767 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.479938 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.479983 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.480007 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.480036 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.480059 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.567733 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.567788 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.567808 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.567831 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.567848 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.583658 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.588520 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.588578 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.588597 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.588617 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.588633 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.604887 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.609390 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.609489 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.609516 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.609549 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.609573 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.623899 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.628696 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.628764 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.628785 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.628812 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.628831 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.643332 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.649039 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.649107 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.649129 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.649156 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.649183 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.665239 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.665504 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.667789 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.667816 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.667829 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.667845 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.667856 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.771421 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.771463 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.771480 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.771503 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.771519 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.873946 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.874020 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.874050 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.874084 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.874108 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.882502 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.882600 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.882502 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.882723 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.882902 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.882985 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.883355 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:47 crc kubenswrapper[4690]: E0320 17:33:47.884250 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.977573 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.977899 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.977916 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.977940 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:47 crc kubenswrapper[4690]: I0320 17:33:47.977973 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:47Z","lastTransitionTime":"2026-03-20T17:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.080762 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.080840 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.080866 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.080896 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.080914 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.183501 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.183560 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.183581 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.183606 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.183623 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.286987 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.287052 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.287111 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.287141 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.287167 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.390446 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.390538 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.390559 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.390587 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.390604 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.391437 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3" exitCode=0 Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.391592 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.410198 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.426943 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.440136 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.451580 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.477410 4690 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.493353 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.493426 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.493449 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.493479 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.493533 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.495185 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.507859 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.521441 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.536355 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers 
with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.549464 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.567909 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.585154 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.597027 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.597094 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.597118 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.597149 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.597175 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.602729 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.623236 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.653070 4690 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkub
e-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574
53265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.699671 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.701974 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.702004 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.702017 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.702037 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc 
kubenswrapper[4690]: I0320 17:33:48.702049 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.709855 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.804388 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.804445 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.804462 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.804484 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.804500 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.907193 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.907287 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.907318 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.907348 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:48 crc kubenswrapper[4690]: I0320 17:33:48.907369 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:48Z","lastTransitionTime":"2026-03-20T17:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.011874 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.011953 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.011976 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.012007 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.012029 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.115132 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.115199 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.115220 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.115248 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.115292 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.218803 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.219198 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.219216 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.219243 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.219290 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.322345 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.322407 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.322425 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.322449 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.322465 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.400719 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.400796 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.400827 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.400852 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.400875 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.400897 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.425819 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.426121 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.426313 4690 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.426479 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.426611 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.530086 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.530155 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.530173 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.530201 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.530221 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.633759 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.633819 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.633836 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.633860 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.633893 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.736468 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.736510 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.736520 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.736535 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.736545 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.840061 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.841391 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.841542 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.841682 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.841802 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.882859 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.883090 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.883084 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:49 crc kubenswrapper[4690]: E0320 17:33:49.883292 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.883336 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:49 crc kubenswrapper[4690]: E0320 17:33:49.883550 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:49 crc kubenswrapper[4690]: E0320 17:33:49.884223 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:49 crc kubenswrapper[4690]: E0320 17:33:49.884960 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.946748 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.946796 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.946815 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.946841 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:49 crc kubenswrapper[4690]: I0320 17:33:49.946860 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:49Z","lastTransitionTime":"2026-03-20T17:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.048801 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.048835 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.048845 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.048860 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.048869 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.151902 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.151947 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.151957 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.151974 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.151986 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.253633 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.253668 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.253677 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.253689 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.253698 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.356574 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.356639 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.356660 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.356687 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.356709 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.407201 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.407333 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.410207 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.410332 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.412432 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4rfg5" event={"ID":"deaf1de2-4906-4e89-ae1b-83b6d35f97a6","Type":"ContainerStarted","Data":"53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.418561 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerStarted","Data":"56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.436437 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.459331 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.459389 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.459407 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.459443 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.459461 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.467505 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.480705 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.493820 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.503229 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.511076 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.519481 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.534579 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.548449 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.562526 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.562592 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.562612 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.562639 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.562658 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.562843 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.581101 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.596215 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.613914 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.628319 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.649623 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.665202 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.665249 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.665313 4690 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.665341 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.665362 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.666959 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.684950 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.702246 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.717715 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8
b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.738305 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.757042 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.768319 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.768347 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.768357 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.768373 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.768383 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.773141 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.788407 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.809825 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.824665 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8
e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.837227 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\
":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.849211 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.862743 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/
\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.871144 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.871199 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.871212 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.871231 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.871243 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.878102 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.898876 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.922326 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.940617 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.966040 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:50 crc 
kubenswrapper[4690]: I0320 17:33:50.973850 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.973891 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.973903 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.973919 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.973933 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:50Z","lastTransitionTime":"2026-03-20T17:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:50 crc kubenswrapper[4690]: I0320 17:33:50.987209 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:50Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.076928 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.076963 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.076974 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.076989 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.077000 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.180953 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.181053 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.181112 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.181142 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.181215 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.284782 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.284840 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.284857 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.284880 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.284897 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.387563 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.387914 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.387931 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.388101 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.388129 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.425763 4690 generic.go:334] "Generic (PLEG): container finished" podID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" containerID="56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad" exitCode=0 Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.425831 4690 generic.go:334] "Generic (PLEG): container finished" podID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" containerID="2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457" exitCode=0 Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.425848 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerDied","Data":"56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.425914 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerDied","Data":"2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.447569 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.470346 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.487367 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.491706 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.491760 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.491779 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.491802 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc 
kubenswrapper[4690]: I0320 17:33:51.491821 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.509180 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.544582 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.559812 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.581338 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.594048 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.594099 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.594118 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.594142 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.594159 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.604116 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.620893 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.635633 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.653859 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.670294 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.687226 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.696412 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.696463 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.696481 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.696505 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.696522 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.702157 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.720463 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.737455 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.751967 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:51Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.801769 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.801876 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.801943 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.801977 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.802039 4690 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.883430 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:51 crc kubenswrapper[4690]: E0320 17:33:51.883606 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.884856 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:51 crc kubenswrapper[4690]: E0320 17:33:51.884940 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.885009 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:51 crc kubenswrapper[4690]: E0320 17:33:51.885083 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.885557 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:51 crc kubenswrapper[4690]: E0320 17:33:51.885698 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.910062 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.910166 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.910190 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.910223 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:51 crc kubenswrapper[4690]: I0320 17:33:51.910244 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:51Z","lastTransitionTime":"2026-03-20T17:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.042647 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.043011 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.043024 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.043042 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.043053 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.145461 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.145525 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.145536 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.145552 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.145562 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.249158 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.249219 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.249239 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.249298 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.249320 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.352933 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.353019 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.353045 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.353071 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.353088 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.433033 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.435626 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" event={"ID":"3f51dea1-fc10-4d4a-9065-2d0c020b36f9","Type":"ContainerStarted","Data":"3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.435683 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" event={"ID":"3f51dea1-fc10-4d4a-9065-2d0c020b36f9","Type":"ContainerStarted","Data":"fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.439499 4690 generic.go:334] "Generic (PLEG): container finished" podID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" containerID="971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47" exitCode=0 Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.439589 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerDied","Data":"971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.442210 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qhmg6" event={"ID":"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3","Type":"ContainerStarted","Data":"19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.451076 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.456162 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.456327 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.456360 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.456401 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.456423 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.458961 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.479736 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.497399 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.521850 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 
2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.542984 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.561085 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.561222 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.561242 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.561337 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.561361 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.564891 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.588529 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.615937 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.638722 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.658333 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.664977 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.665029 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.665045 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.665066 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.665081 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.689786 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.706234 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.721744 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.734944 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.751973 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.765351 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.767344 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.767403 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.767422 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.767447 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.767465 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.777587 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.790238 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.803432 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.814915 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.824485 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.840403 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-c
ert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.856014 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.865053 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.869134 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.869166 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.869178 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.869195 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.869206 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.875326 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.882983 4690 scope.go:117] "RemoveContainer" containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" Mar 20 17:33:52 crc kubenswrapper[4690]: E0320 17:33:52.883113 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.886167 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.898443 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8
e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.914206 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.927067 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.941095 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.959331 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.972298 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.972380 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.972398 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.972422 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.972441 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:52Z","lastTransitionTime":"2026-03-20T17:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:52 crc kubenswrapper[4690]: I0320 17:33:52.981059 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a
0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.012368 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f
1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.026368 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.074785 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.074823 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.074832 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.074847 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.074858 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.177471 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.177511 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.177523 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.177540 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.177552 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.279971 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.280004 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.280036 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.280052 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.280063 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.382583 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.382637 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.382652 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.382674 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.382691 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.458019 4690 generic.go:334] "Generic (PLEG): container finished" podID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" containerID="e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed" exitCode=0 Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.458073 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerDied","Data":"e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.475437 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.485807 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.485877 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.485894 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.485917 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.485936 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.501640 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 
crc kubenswrapper[4690]: I0320 17:33:53.521563 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.546195 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.560721 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.574841 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.589758 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.590047 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.590098 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.590117 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.590141 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.590158 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.601816 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.615402 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.629196 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 
17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.649778 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.668235 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.689037 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.692597 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.692635 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.692647 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.692666 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc 
kubenswrapper[4690]: I0320 17:33:53.692678 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.711375 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0
fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.724922 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.738373 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.754489 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:53Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.795387 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.795431 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.795443 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.795463 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.795477 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.883183 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:53 crc kubenswrapper[4690]: E0320 17:33:53.883318 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.883609 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:53 crc kubenswrapper[4690]: E0320 17:33:53.883661 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.883698 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:53 crc kubenswrapper[4690]: E0320 17:33:53.883738 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.886068 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:53 crc kubenswrapper[4690]: E0320 17:33:53.886155 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.897607 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.898309 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.898401 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.898473 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:53 crc kubenswrapper[4690]: I0320 17:33:53.898538 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:53Z","lastTransitionTime":"2026-03-20T17:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.001152 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.002037 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.002141 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.002221 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.002332 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.105094 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.105138 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.105155 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.105180 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.105197 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.208386 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.208435 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.208499 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.208523 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.208540 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.311450 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.311509 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.311520 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.311539 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.311553 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.415549 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.415601 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.415616 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.415636 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.415648 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.466347 4690 generic.go:334] "Generic (PLEG): container finished" podID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" containerID="aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22" exitCode=0 Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.466410 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerDied","Data":"aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.474738 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.475421 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.489095 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.508149 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.515304 4690 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.519893 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.519966 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.520075 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.520175 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.520345 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.524976 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.543143 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.562528 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}
}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.578563 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.596378 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.624225 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.624293 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.624306 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.624326 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.624338 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.636701 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.655162 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.681078 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.713111 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.727300 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.727357 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.727376 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.727400 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.727418 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.729276 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.743810 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.759064 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.774568 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.791821 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.809430 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.824410 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.830373 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.830459 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.830484 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.830510 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.830531 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.846410 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.866372 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.882384 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.906009 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.935377 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.935658 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.935670 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.935686 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.935696 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:54Z","lastTransitionTime":"2026-03-20T17:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.941983 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.968708 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:54 crc kubenswrapper[4690]: I0320 17:33:54.986370 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:54Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.003177 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.020429 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.038630 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.038689 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.038706 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.038731 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.038750 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.039091 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.055990 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.077319 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.095284 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.112374 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.130441 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 
17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.141236 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.141897 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.141923 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.141934 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.141951 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.141967 4690 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.244816 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.244890 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.244908 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.244938 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.244956 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.348658 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.348732 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.348789 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.348820 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.348843 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.452355 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.452433 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.452465 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.452490 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.452508 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.480854 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bf8dm" event={"ID":"189715be-f690-4a1d-9bd3-fb0dcae7affe","Type":"ContainerStarted","Data":"6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.489693 4690 generic.go:334] "Generic (PLEG): container finished" podID="3fe7c1d1-7aa9-4c64-941e-7415a99367ea" containerID="7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c" exitCode=0 Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.490086 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerDied","Data":"7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.490771 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.490850 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.513064 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.532427 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.540788 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.558014 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.558104 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.558129 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.558158 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.558184 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.574647 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.592108 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.613686 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.632034 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.643216 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.654506 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.660142 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.660182 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.660193 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.660210 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.660222 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.669768 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.687223 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.708098 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.735107 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 
17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.753892 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.765490 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.765534 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.765545 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.765563 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.765577 4690 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.782938 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.803023 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.823794 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.843800 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.868044 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.868088 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.868102 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.868124 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.868141 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.873039 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.883228 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.883366 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.883283 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:55 crc kubenswrapper[4690]: E0320 17:33:55.883443 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.883520 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:55 crc kubenswrapper[4690]: E0320 17:33:55.883638 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:55 crc kubenswrapper[4690]: E0320 17:33:55.883865 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:55 crc kubenswrapper[4690]: E0320 17:33:55.883979 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.887965 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.912136 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.936668 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc
2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.958176 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.970233 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.970295 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.970309 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.970326 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.970339 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:55Z","lastTransitionTime":"2026-03-20T17:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:55 crc kubenswrapper[4690]: I0320 17:33:55.984065 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe
60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.004116 4690 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:17
4f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.020179 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.033489 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.048699 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.064955 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.072499 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.072561 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.072574 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.072591 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.072601 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.077686 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.103809 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104025 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:28.103995727 +0000 UTC m=+142.969821405 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.104082 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.104136 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.104175 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.104217 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.104299 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104297 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104389 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104412 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104413 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:33:56 
crc kubenswrapper[4690]: E0320 17:33:56.104459 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104474 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104498 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104428 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104458 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:28.10443858 +0000 UTC m=+142.970264248 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104337 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104558 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:28.104548893 +0000 UTC m=+142.970374571 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104573 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:28.104564313 +0000 UTC m=+142.970390211 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104609 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:28.104581594 +0000 UTC m=+142.970407502 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:33:56 crc kubenswrapper[4690]: E0320 17:33:56.104626 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:28.104618315 +0000 UTC m=+142.970444173 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.168486 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.182416 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.182447 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.182456 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.182469 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.182477 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.191302 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.210878 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8
e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.224637 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\
":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.238441 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.258213 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.270531 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.284816 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.284858 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.284896 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.284914 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.284926 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.286094 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe
60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.306797 4690 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:17
4f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.317749 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.327610 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.340835 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.352442 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.363379 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.405986 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.407491 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.407579 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.407601 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.407634 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.407658 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.418830 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.440310 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.460677 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.473987 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 
17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.488429 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.498029 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" 
event={"ID":"3fe7c1d1-7aa9-4c64-941e-7415a99367ea","Type":"ContainerStarted","Data":"f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.500329 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.506983 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.509833 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.509883 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.509898 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.509917 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.509934 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.529135 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.545492 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.559590 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.574963 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.597180 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-c
ert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.613454 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.613536 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.613561 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.613588 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.613607 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.616993 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.634248 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.651826 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.673414 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.687705 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.700986 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.714495 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.716132 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.716220 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.716239 4690 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.716297 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.716317 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.735429 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.752834 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.775114 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c64eaa45aef662646090576ff2b79e93b622b9
8520ed7fc96d04d9d8bf4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.797578 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.812366 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.821032 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.821096 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.821116 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.821144 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.821162 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.836439 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.923823 4690 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.923859 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.923869 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.923885 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:56 crc kubenswrapper[4690]: I0320 17:33:56.923897 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:56Z","lastTransitionTime":"2026-03-20T17:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.026287 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.026360 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.026380 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.026406 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.026424 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.128879 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.128966 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.128992 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.129028 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.129053 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.232915 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.232970 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.232988 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.233012 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.233029 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.336457 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.336519 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.336537 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.336560 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.336576 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.439956 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.440022 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.440041 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.440069 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.440086 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.506450 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/0.log" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.510947 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec" exitCode=1 Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.511013 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.512206 4690 scope.go:117] "RemoveContainer" containerID="33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.534377 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.542168 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.542208 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.542218 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.542232 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.542241 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.558085 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.574429 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.586349 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.605878 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.635532 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"message\\\":\\\"0320 17:33:56.858782 6480 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:33:56.858834 6480 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:33:56.858854 6480 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:33:56.858860 6480 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:33:56.858875 6480 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:33:56.858881 6480 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:33:56.858897 6480 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:33:56.858916 6480 factory.go:656] Stopping watch factory\\\\nI0320 17:33:56.858932 6480 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:33:56.858968 6480 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:33:56.858978 6480 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:33:56.858987 6480 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:33:56.858997 6480 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 17:33:56.859005 6480 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 17:33:56.859016 6480 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:33:56.859024 6480 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.644157 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.644206 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.644221 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.644238 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.644270 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.659641 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.674977 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.690179 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.705458 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.715947 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.728901 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.744994 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-c
ert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.746473 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.746531 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.746548 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.746573 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.746592 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.760885 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.772585 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.786029 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 
17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.803625 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.848971 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.849019 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.849036 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.849060 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.849077 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.882806 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.882848 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.882864 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 17:33:57 crc kubenswrapper[4690]: E0320 17:33:57.882961 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 17:33:57 crc kubenswrapper[4690]: E0320 17:33:57.883046 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 17:33:57 crc kubenswrapper[4690]: E0320 17:33:57.883180 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.883357 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 17:33:57 crc kubenswrapper[4690]: E0320 17:33:57.883584 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.952228 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.952289 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.952303 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.952319 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.952330 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.986680 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.986719 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.986727 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.986741 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:33:57 crc kubenswrapper[4690]: I0320 17:33:57.986751 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:57Z","lastTransitionTime":"2026-03-20T17:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: E0320 17:33:58.007666 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.012184 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.012232 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.012243 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.012279 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.012290 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: E0320 17:33:58.028558 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.033232 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.033282 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.033293 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.033307 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.033317 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: E0320 17:33:58.044195 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.047689 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.047738 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.047760 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.047776 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.047788 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: E0320 17:33:58.059990 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.063182 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.063220 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.063233 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.063248 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.063270 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: E0320 17:33:58.074732 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: E0320 17:33:58.074883 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.076199 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.076228 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.076237 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.076271 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.076283 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.178610 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.178655 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.178667 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.178688 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.178700 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.281104 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.281156 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.281169 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.281188 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.281199 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.383226 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.383302 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.383314 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.383330 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.383338 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.485471 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.485500 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.485510 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.485524 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.485533 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.515400 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/1.log" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.515873 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/0.log" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.517924 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e" exitCode=1 Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.517960 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.517994 4690 scope.go:117] "RemoveContainer" containerID="33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.518662 4690 scope.go:117] "RemoveContainer" containerID="0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e" Mar 20 17:33:58 crc kubenswrapper[4690]: E0320 17:33:58.518807 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.536971 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.551446 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.568381 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.588227 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.588295 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.588309 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.588428 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.588443 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.591669 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.604996 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.621450 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.647577 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c64eaa45aef662646090576ff2b79e93b622b98520ed7fc96d04d9d8bf4dec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"message\\\":\\\"0320 17:33:56.858782 6480 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:33:56.858834 6480 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:33:56.858854 6480 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:33:56.858860 6480 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:33:56.858875 6480 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:33:56.858881 6480 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:33:56.858897 6480 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:33:56.858916 6480 factory.go:656] Stopping watch factory\\\\nI0320 17:33:56.858932 6480 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:33:56.858968 6480 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:33:56.858978 6480 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:33:56.858987 6480 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:33:56.858997 6480 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 17:33:56.859005 6480 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 17:33:56.859016 6480 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:33:56.859024 6480 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:33:58.453424 6680 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006b7dc67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.664149 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.678190 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.690847 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.690879 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.690889 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.690903 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.690912 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.695487 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.708072 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.720857 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.732691 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.746429 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.763389 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.777454 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.792096 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z" Mar 20 
17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.793000 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.793071 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.793092 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.793116 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.793134 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.895962 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.896002 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.896012 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.896027 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.896039 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.998693 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.998772 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.998795 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.998836 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:58 crc kubenswrapper[4690]: I0320 17:33:58.998871 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:58Z","lastTransitionTime":"2026-03-20T17:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.101469 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.101527 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.101537 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.101552 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.101563 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.205145 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.205200 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.205210 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.205226 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.205279 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.308542 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.308600 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.308609 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.308623 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.308634 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.412305 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.412366 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.412377 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.412395 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.412408 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.515309 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.515383 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.515404 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.515429 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.515446 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.523178 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/1.log" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.528140 4690 scope.go:117] "RemoveContainer" containerID="0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e" Mar 20 17:33:59 crc kubenswrapper[4690]: E0320 17:33:59.528850 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.542682 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.556987 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.570849 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.582383 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 
17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.594929 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.608551 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.617980 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.618025 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.618034 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.618048 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.618060 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.621191 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.642819 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.653417 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.669673 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.689599 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:33:58.453424 6680 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006b7dc67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"o
vnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.700338 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.714012 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.720289 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.720359 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.720373 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.720393 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.720407 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.726656 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.739511 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.753059 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.766813 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.823179 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.823211 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.823221 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.823235 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.823244 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.882464 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.882500 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.882526 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:33:59 crc kubenswrapper[4690]: E0320 17:33:59.882646 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.882692 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:33:59 crc kubenswrapper[4690]: E0320 17:33:59.882853 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:33:59 crc kubenswrapper[4690]: E0320 17:33:59.882978 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:33:59 crc kubenswrapper[4690]: E0320 17:33:59.883076 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.926061 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.926108 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.926121 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.926148 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:33:59 crc kubenswrapper[4690]: I0320 17:33:59.926161 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:33:59Z","lastTransitionTime":"2026-03-20T17:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.028245 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.028340 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.028361 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.028391 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.028411 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.131449 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.131541 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.131558 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.131582 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.131599 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.234090 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.234160 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.234181 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.234203 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.234220 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.337390 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.337453 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.337469 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.337494 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.337511 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.439865 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.439929 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.439945 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.439970 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.439999 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.542625 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.542661 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.542669 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.542684 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.542692 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.646128 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.646213 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.646232 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.646295 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.646316 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.749696 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.749800 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.749827 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.749863 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.749891 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.853389 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.853468 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.853487 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.853512 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.853530 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.956867 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.956928 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.956946 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.956970 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:00 crc kubenswrapper[4690]: I0320 17:34:00.956991 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:00Z","lastTransitionTime":"2026-03-20T17:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.060487 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.060538 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.060549 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.060567 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.060579 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.162927 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.162965 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.162976 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.162993 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.163004 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.265958 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.266023 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.266045 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.266072 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.266094 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.369452 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.369519 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.369538 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.369562 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.369581 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.472974 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.473057 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.473076 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.473099 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.473150 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.576009 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.576097 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.576116 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.576146 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.576166 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.678664 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.678729 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.678748 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.679196 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.679461 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.782821 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.782855 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.782864 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.782879 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.782889 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.883116 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.883166 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.883225 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:01 crc kubenswrapper[4690]: E0320 17:34:01.883364 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.883390 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:01 crc kubenswrapper[4690]: E0320 17:34:01.883534 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:01 crc kubenswrapper[4690]: E0320 17:34:01.883628 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:01 crc kubenswrapper[4690]: E0320 17:34:01.883811 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.885471 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.885535 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.885555 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.885578 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.885597 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.989106 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.989226 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.989251 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.989316 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:01 crc kubenswrapper[4690]: I0320 17:34:01.989340 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:01Z","lastTransitionTime":"2026-03-20T17:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.093227 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.093380 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.093398 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.093424 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.093441 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.198629 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.198709 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.198741 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.198765 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.198782 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.301676 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.301908 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.301919 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.301934 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.301944 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.405139 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.405210 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.405227 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.405278 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.405298 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.508889 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.508951 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.508968 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.508991 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.509009 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.611950 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.611997 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.612012 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.612033 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.612050 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.715426 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.715524 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.715548 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.715615 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.715694 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.819023 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.819090 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.819107 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.819134 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.819152 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.896694 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.922779 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.922847 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.922869 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.922894 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:02 crc kubenswrapper[4690]: I0320 17:34:02.922913 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:02Z","lastTransitionTime":"2026-03-20T17:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.025447 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.025507 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.025526 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.025551 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.025568 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.127939 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.128033 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.128057 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.128093 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.128115 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.235084 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.235132 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.235141 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.235160 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.235172 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.338526 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.338589 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.338613 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.338645 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.338671 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.442019 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.442082 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.442105 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.442133 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.442154 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.544212 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.544317 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.544342 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.544371 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.544393 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.647400 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.647472 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.647489 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.647512 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.647530 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.750234 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.750345 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.750374 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.750419 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.750443 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.853057 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.853124 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.853147 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.853185 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.853219 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.882730 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.882730 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.882751 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.882851 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:03 crc kubenswrapper[4690]: E0320 17:34:03.882959 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:03 crc kubenswrapper[4690]: E0320 17:34:03.883113 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:03 crc kubenswrapper[4690]: E0320 17:34:03.883235 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:03 crc kubenswrapper[4690]: E0320 17:34:03.883331 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.955629 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.955695 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.955750 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.955779 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:03 crc kubenswrapper[4690]: I0320 17:34:03.955805 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:03Z","lastTransitionTime":"2026-03-20T17:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.059241 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.059305 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.059315 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.059328 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.059337 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.162128 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.162184 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.162201 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.162225 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.162247 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.265455 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.265527 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.265552 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.265583 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.265603 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.370383 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.370465 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.370485 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.370508 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.370528 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.473109 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.473150 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.473161 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.473177 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.473188 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.575914 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.575980 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.575999 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.576025 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.576047 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.678828 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.678891 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.678912 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.678939 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.678959 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.781591 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.781656 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.781676 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.781717 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.781735 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.884079 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.884124 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.884134 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.884147 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.884157 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.987037 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.987087 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.987102 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.987125 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:04 crc kubenswrapper[4690]: I0320 17:34:04.987143 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:04Z","lastTransitionTime":"2026-03-20T17:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.090182 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.090298 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.090324 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.090352 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.090373 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:05Z","lastTransitionTime":"2026-03-20T17:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.193103 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.193165 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.193182 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.193205 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.193227 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:05Z","lastTransitionTime":"2026-03-20T17:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.296817 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.296873 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.296890 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.296914 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.296930 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:05Z","lastTransitionTime":"2026-03-20T17:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.400401 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.400485 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.400508 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.400538 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.400561 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:05Z","lastTransitionTime":"2026-03-20T17:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.504042 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.504121 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.504142 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.504170 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.504187 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:05Z","lastTransitionTime":"2026-03-20T17:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.608073 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.608158 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.608169 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.608212 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.608228 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:05Z","lastTransitionTime":"2026-03-20T17:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.711368 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.711432 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.711444 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.711463 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.711473 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:05Z","lastTransitionTime":"2026-03-20T17:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:05 crc kubenswrapper[4690]: E0320 17:34:05.812778 4690 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.882839 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.883083 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.883209 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.883295 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:05 crc kubenswrapper[4690]: E0320 17:34:05.883407 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:05 crc kubenswrapper[4690]: E0320 17:34:05.883532 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:05 crc kubenswrapper[4690]: E0320 17:34:05.883747 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.883888 4690 scope.go:117] "RemoveContainer" containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" Mar 20 17:34:05 crc kubenswrapper[4690]: E0320 17:34:05.884640 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.901763 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.917949 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.930134 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:05Z is after 
2025-08-24T17:21:41Z" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.959503 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:05 crc kubenswrapper[4690]: I0320 17:34:05.978741 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:05 crc kubenswrapper[4690]: E0320 17:34:05.985859 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.001141 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.034523 4690 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:33:58.453424 6680 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006b7dc67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"o
vnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.057430 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.073382 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.090377 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.108390 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.122249 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.135321 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.148046 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.159556 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.168273 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.178967 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:06 crc kubenswrapper[4690]: I0320 17:34:06.189573 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.562055 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.564837 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76"} Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.565304 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.601121 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.618490 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.642399 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.673939 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:33:58.453424 6680 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006b7dc67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"o
vnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.688574 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.706230 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.724962 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.742584 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.788879 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.816634 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.828740 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.839796 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.850576 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.864027 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.877344 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.882667 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.882727 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.882769 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.882944 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:07 crc kubenswrapper[4690]: E0320 17:34:07.882945 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:07 crc kubenswrapper[4690]: E0320 17:34:07.883087 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:07 crc kubenswrapper[4690]: E0320 17:34:07.883144 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:07 crc kubenswrapper[4690]: E0320 17:34:07.883224 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.894836 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.908957 4690 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:07 crc kubenswrapper[4690]: I0320 17:34:07.927781 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.452290 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.452327 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.452338 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.452355 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.452365 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:08Z","lastTransitionTime":"2026-03-20T17:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:08 crc kubenswrapper[4690]: E0320 17:34:08.470974 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.477318 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.477386 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.477409 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.477436 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.477453 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:08Z","lastTransitionTime":"2026-03-20T17:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:08 crc kubenswrapper[4690]: E0320 17:34:08.493784 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.497854 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.497909 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.497927 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.497949 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.497965 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:08Z","lastTransitionTime":"2026-03-20T17:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:08 crc kubenswrapper[4690]: E0320 17:34:08.514942 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.518931 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.518969 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.518979 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.518995 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.519005 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:08Z","lastTransitionTime":"2026-03-20T17:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:08 crc kubenswrapper[4690]: E0320 17:34:08.537482 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.542020 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.542066 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.542077 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.542094 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:08 crc kubenswrapper[4690]: I0320 17:34:08.542106 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:08Z","lastTransitionTime":"2026-03-20T17:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:08 crc kubenswrapper[4690]: E0320 17:34:08.560564 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:08 crc kubenswrapper[4690]: E0320 17:34:08.560698 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:34:09 crc kubenswrapper[4690]: I0320 17:34:09.883088 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:09 crc kubenswrapper[4690]: I0320 17:34:09.883112 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:09 crc kubenswrapper[4690]: I0320 17:34:09.883298 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:09 crc kubenswrapper[4690]: E0320 17:34:09.883459 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:09 crc kubenswrapper[4690]: I0320 17:34:09.883483 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:09 crc kubenswrapper[4690]: E0320 17:34:09.883549 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:09 crc kubenswrapper[4690]: E0320 17:34:09.883678 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:09 crc kubenswrapper[4690]: E0320 17:34:09.883926 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:10 crc kubenswrapper[4690]: E0320 17:34:10.987135 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:34:11 crc kubenswrapper[4690]: I0320 17:34:11.883200 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:11 crc kubenswrapper[4690]: I0320 17:34:11.883293 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:11 crc kubenswrapper[4690]: I0320 17:34:11.883312 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:11 crc kubenswrapper[4690]: E0320 17:34:11.883367 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:11 crc kubenswrapper[4690]: I0320 17:34:11.883541 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:11 crc kubenswrapper[4690]: E0320 17:34:11.883562 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:11 crc kubenswrapper[4690]: E0320 17:34:11.883604 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:11 crc kubenswrapper[4690]: E0320 17:34:11.883677 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:12 crc kubenswrapper[4690]: I0320 17:34:12.882890 4690 scope.go:117] "RemoveContainer" containerID="0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.587525 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/1.log" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.590642 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6"} Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.591249 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.608021 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{
\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.626346 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.644212 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.671764 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.685671 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.703831 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.730536 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:33:58.453424 6680 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006b7dc67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"}
,{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.746018 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.763278 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.781995 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.799871 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.820601 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.832722 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.845331 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.861581 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.895957 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.896046 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:13 crc kubenswrapper[4690]: E0320 17:34:13.896160 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.895956 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:13 crc kubenswrapper[4690]: E0320 17:34:13.896807 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.896377 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:13 crc kubenswrapper[4690]: E0320 17:34:13.896996 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:13 crc kubenswrapper[4690]: E0320 17:34:13.897154 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.911419 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.928019 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:13 crc kubenswrapper[4690]: I0320 17:34:13.944284 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.596412 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/2.log" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.597568 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/1.log" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.601610 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6" exitCode=1 Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.601686 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6"} Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.601781 4690 scope.go:117] "RemoveContainer" containerID="0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.602839 4690 scope.go:117] "RemoveContainer" containerID="31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6" Mar 20 17:34:14 crc kubenswrapper[4690]: E0320 17:34:14.603115 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.622851 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.638593 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.654800 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.670755 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.684604 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.707683 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-c
ert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.722926 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.735987 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.749248 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.763500 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.777881 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.791011 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.807104 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.824851 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.839592 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.857470 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.885441 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c9d11cdb738402f6fe1772ac1ecc821fce38aec2a4d791927874099c1c91f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:33:58Z\\\",\\\"message\\\":\\\"webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:33:58Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:33:58.453424 6680 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006b7dc67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:14 crc kubenswrapper[4690]: I0320 17:34:14.905653 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2c
d19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.606985 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/2.log" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.611568 4690 scope.go:117] "RemoveContainer" 
containerID="31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6" Mar 20 17:34:15 crc kubenswrapper[4690]: E0320 17:34:15.611752 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.627901 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.650425 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.668152 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.684446 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.704684 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-c
ert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.726387 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.741962 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.754568 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.769221 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.783404 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.798158 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.817199 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.835664 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.852178 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.879500 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.882703 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.882715 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:15 crc kubenswrapper[4690]: E0320 17:34:15.882949 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:15 crc kubenswrapper[4690]: E0320 17:34:15.883084 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.882717 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:15 crc kubenswrapper[4690]: E0320 17:34:15.883184 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.883400 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:15 crc kubenswrapper[4690]: E0320 17:34:15.883674 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.918998 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.955316 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: I0320 17:34:15.973956 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:15 crc kubenswrapper[4690]: E0320 17:34:15.988871 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.005936 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.021324 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.036247 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.053592 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.069734 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.082507 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.093223 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.104639 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.114831 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.125853 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.142953 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.156789 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.172458 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.186161 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.214207 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.226056 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.248595 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:16 crc kubenswrapper[4690]: I0320 17:34:16.279971 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:17 crc kubenswrapper[4690]: I0320 17:34:17.882786 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:17 crc kubenswrapper[4690]: I0320 17:34:17.882789 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:17 crc kubenswrapper[4690]: I0320 17:34:17.882855 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:17 crc kubenswrapper[4690]: E0320 17:34:17.883043 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:17 crc kubenswrapper[4690]: I0320 17:34:17.882919 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:17 crc kubenswrapper[4690]: E0320 17:34:17.883193 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:17 crc kubenswrapper[4690]: E0320 17:34:17.883473 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:17 crc kubenswrapper[4690]: E0320 17:34:17.883704 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.749577 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.749644 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.749656 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.749676 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.749691 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:18Z","lastTransitionTime":"2026-03-20T17:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:18 crc kubenswrapper[4690]: E0320 17:34:18.765983 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:18Z is after 
2025-08-24T17:21:41Z" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.771194 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.771462 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.771496 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.771526 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.771546 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:18Z","lastTransitionTime":"2026-03-20T17:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:18 crc kubenswrapper[4690]: E0320 17:34:18.793994 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:18Z is after 
2025-08-24T17:21:41Z" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.799801 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.799926 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.799946 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.799977 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.799995 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:18Z","lastTransitionTime":"2026-03-20T17:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:18 crc kubenswrapper[4690]: E0320 17:34:18.815526 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:18Z is after 
2025-08-24T17:21:41Z" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.821613 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.821683 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.821700 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.821726 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.821743 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:18Z","lastTransitionTime":"2026-03-20T17:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:18 crc kubenswrapper[4690]: E0320 17:34:18.844103 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:18Z is after 
2025-08-24T17:21:41Z" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.851158 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.851227 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.851248 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.851638 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:18 crc kubenswrapper[4690]: I0320 17:34:18.851960 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:18Z","lastTransitionTime":"2026-03-20T17:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:18 crc kubenswrapper[4690]: E0320 17:34:18.872052 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:18Z is after 
2025-08-24T17:21:41Z" Mar 20 17:34:18 crc kubenswrapper[4690]: E0320 17:34:18.872377 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:34:19 crc kubenswrapper[4690]: I0320 17:34:19.882551 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:19 crc kubenswrapper[4690]: I0320 17:34:19.882621 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:19 crc kubenswrapper[4690]: I0320 17:34:19.882693 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:19 crc kubenswrapper[4690]: I0320 17:34:19.882568 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:19 crc kubenswrapper[4690]: E0320 17:34:19.882774 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:19 crc kubenswrapper[4690]: E0320 17:34:19.882961 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:19 crc kubenswrapper[4690]: E0320 17:34:19.883123 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:19 crc kubenswrapper[4690]: E0320 17:34:19.883167 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:20 crc kubenswrapper[4690]: I0320 17:34:20.901378 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 17:34:20 crc kubenswrapper[4690]: E0320 17:34:20.989861 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:34:21 crc kubenswrapper[4690]: I0320 17:34:21.883073 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:21 crc kubenswrapper[4690]: I0320 17:34:21.883176 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:21 crc kubenswrapper[4690]: I0320 17:34:21.883187 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:21 crc kubenswrapper[4690]: E0320 17:34:21.883301 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:21 crc kubenswrapper[4690]: I0320 17:34:21.883373 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:21 crc kubenswrapper[4690]: E0320 17:34:21.883512 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:21 crc kubenswrapper[4690]: E0320 17:34:21.883624 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:21 crc kubenswrapper[4690]: E0320 17:34:21.883772 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:23 crc kubenswrapper[4690]: I0320 17:34:23.883141 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:23 crc kubenswrapper[4690]: I0320 17:34:23.883190 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:23 crc kubenswrapper[4690]: E0320 17:34:23.883685 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:23 crc kubenswrapper[4690]: I0320 17:34:23.883331 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:23 crc kubenswrapper[4690]: I0320 17:34:23.883298 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:23 crc kubenswrapper[4690]: E0320 17:34:23.883791 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:23 crc kubenswrapper[4690]: E0320 17:34:23.883948 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:23 crc kubenswrapper[4690]: E0320 17:34:23.884057 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.433889 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.464353 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7
79036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.480326 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.503416 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.534791 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.554940 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@
sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.577308 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e
6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.597558 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.626174 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.653144 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.677918 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.690084 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.700700 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.711136 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.720643 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.730242 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.742183 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.754171 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.767047 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.783356 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.882450 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.882475 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.882494 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.882539 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:25 crc kubenswrapper[4690]: E0320 17:34:25.882892 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:25 crc kubenswrapper[4690]: E0320 17:34:25.883002 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:25 crc kubenswrapper[4690]: E0320 17:34:25.883094 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:25 crc kubenswrapper[4690]: E0320 17:34:25.883180 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.895327 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.912866 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.923225 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.941850 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.967878 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.979380 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c
2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:25 crc kubenswrapper[4690]: E0320 17:34:25.991011 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:34:25 crc kubenswrapper[4690]: I0320 17:34:25.994192 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.016544 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.035049 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.053649 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.072113 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.092275 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.104718 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.116677 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.127374 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.141703 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.159436 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.176591 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:26 crc kubenswrapper[4690]: I0320 17:34:26.194273 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:26Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:27 crc kubenswrapper[4690]: I0320 17:34:27.882670 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:27 crc kubenswrapper[4690]: I0320 17:34:27.882760 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:27 crc kubenswrapper[4690]: I0320 17:34:27.882799 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:27 crc kubenswrapper[4690]: I0320 17:34:27.882859 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:27 crc kubenswrapper[4690]: E0320 17:34:27.883085 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:27 crc kubenswrapper[4690]: E0320 17:34:27.883223 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:27 crc kubenswrapper[4690]: E0320 17:34:27.883415 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:27 crc kubenswrapper[4690]: E0320 17:34:27.883577 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.166412 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.166553 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.166597 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.166641 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:32.166611575 +0000 UTC m=+207.032437263 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.166701 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.166782 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.166790 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.166828 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.166838 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.166846 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.166914 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:32.166891833 +0000 UTC m=+207.032717541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.166976 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167004 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167024 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167032 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:32.167023597 +0000 UTC m=+207.032849395 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167039 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167081 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167138 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167081 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:32.167066598 +0000 UTC m=+207.032892306 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167167 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:32.167160791 +0000 UTC m=+207.032986599 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.167178 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:32.167172331 +0000 UTC m=+207.032998159 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.884849 4690 scope.go:117] "RemoveContainer" containerID="31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6" Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.885249 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.949714 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.949764 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.949775 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.949790 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.949801 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:28Z","lastTransitionTime":"2026-03-20T17:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.963578 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.968796 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.968847 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.968859 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.968874 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.968886 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:28Z","lastTransitionTime":"2026-03-20T17:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:28 crc kubenswrapper[4690]: E0320 17:34:28.986424 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.994479 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.994548 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.994562 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.994578 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:28 crc kubenswrapper[4690]: I0320 17:34:28.994590 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:28Z","lastTransitionTime":"2026-03-20T17:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:29 crc kubenswrapper[4690]: E0320 17:34:29.009380 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:29Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.012875 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.012923 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.012937 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.012957 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.012971 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:29Z","lastTransitionTime":"2026-03-20T17:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:29 crc kubenswrapper[4690]: E0320 17:34:29.026301 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:29Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.030316 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.030350 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.030360 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.030377 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.030385 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:29Z","lastTransitionTime":"2026-03-20T17:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:29 crc kubenswrapper[4690]: E0320 17:34:29.044835 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:29Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:29 crc kubenswrapper[4690]: E0320 17:34:29.045054 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.883036 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.883194 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:29 crc kubenswrapper[4690]: E0320 17:34:29.883327 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:29 crc kubenswrapper[4690]: E0320 17:34:29.883415 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.883482 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:29 crc kubenswrapper[4690]: E0320 17:34:29.883583 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:29 crc kubenswrapper[4690]: I0320 17:34:29.883830 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:29 crc kubenswrapper[4690]: E0320 17:34:29.883974 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:30 crc kubenswrapper[4690]: E0320 17:34:30.992209 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:34:31 crc kubenswrapper[4690]: I0320 17:34:31.973096 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:31 crc kubenswrapper[4690]: E0320 17:34:31.973238 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:31 crc kubenswrapper[4690]: I0320 17:34:31.973337 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:31 crc kubenswrapper[4690]: E0320 17:34:31.973386 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:31 crc kubenswrapper[4690]: I0320 17:34:31.973604 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:31 crc kubenswrapper[4690]: E0320 17:34:31.973655 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:31 crc kubenswrapper[4690]: I0320 17:34:31.973682 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:31 crc kubenswrapper[4690]: E0320 17:34:31.973841 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:33 crc kubenswrapper[4690]: I0320 17:34:33.883205 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:33 crc kubenswrapper[4690]: I0320 17:34:33.883299 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:33 crc kubenswrapper[4690]: E0320 17:34:33.883472 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:33 crc kubenswrapper[4690]: I0320 17:34:33.883529 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:33 crc kubenswrapper[4690]: I0320 17:34:33.883593 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:33 crc kubenswrapper[4690]: E0320 17:34:33.883685 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:33 crc kubenswrapper[4690]: E0320 17:34:33.883743 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:33 crc kubenswrapper[4690]: E0320 17:34:33.883947 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.882432 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.882450 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.882629 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.882862 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:35 crc kubenswrapper[4690]: E0320 17:34:35.883095 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:35 crc kubenswrapper[4690]: E0320 17:34:35.883203 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:35 crc kubenswrapper[4690]: E0320 17:34:35.883477 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:35 crc kubenswrapper[4690]: E0320 17:34:35.883525 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.904355 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contain
erID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.918493 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.939766 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.970786 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:35 crc kubenswrapper[4690]: I0320 17:34:35.988039 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@
sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:35 crc kubenswrapper[4690]: E0320 17:34:35.992879 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.008334 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587
a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.027764 4690 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.040558 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.054906 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.067556 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.080682 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.096547 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.107991 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.119156 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.130976 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.146402 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.162037 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.178795 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:36 crc kubenswrapper[4690]: I0320 17:34:36.196435 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:37 crc kubenswrapper[4690]: I0320 17:34:37.882590 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:37 crc kubenswrapper[4690]: I0320 17:34:37.882679 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:37 crc kubenswrapper[4690]: I0320 17:34:37.882693 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:37 crc kubenswrapper[4690]: E0320 17:34:37.882788 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:37 crc kubenswrapper[4690]: I0320 17:34:37.882812 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:37 crc kubenswrapper[4690]: E0320 17:34:37.882976 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:37 crc kubenswrapper[4690]: E0320 17:34:37.883141 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:37 crc kubenswrapper[4690]: E0320 17:34:37.883337 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.085576 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.085615 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.085631 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.085648 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.085657 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:39Z","lastTransitionTime":"2026-03-20T17:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.097129 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.102363 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.102421 4690 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.102439 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.102462 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.102482 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:39Z","lastTransitionTime":"2026-03-20T17:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.122457 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.127075 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.127114 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.127126 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.127144 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.127157 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:39Z","lastTransitionTime":"2026-03-20T17:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.139877 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.144696 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.144768 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.144796 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.144825 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.144846 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:39Z","lastTransitionTime":"2026-03-20T17:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.157711 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.161572 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.161622 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.161634 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.161650 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.161663 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:39Z","lastTransitionTime":"2026-03-20T17:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.173952 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.174125 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.882706 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.882738 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.882879 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.882989 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:39 crc kubenswrapper[4690]: I0320 17:34:39.883003 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.883174 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.883434 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:39 crc kubenswrapper[4690]: E0320 17:34:39.883557 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:40 crc kubenswrapper[4690]: E0320 17:34:40.995031 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:34:41 crc kubenswrapper[4690]: I0320 17:34:41.882186 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:41 crc kubenswrapper[4690]: I0320 17:34:41.882430 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:41 crc kubenswrapper[4690]: E0320 17:34:41.882437 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:41 crc kubenswrapper[4690]: E0320 17:34:41.882514 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:41 crc kubenswrapper[4690]: I0320 17:34:41.882517 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:41 crc kubenswrapper[4690]: E0320 17:34:41.882604 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:41 crc kubenswrapper[4690]: I0320 17:34:41.882730 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:41 crc kubenswrapper[4690]: E0320 17:34:41.882989 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.709305 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/0.log" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.709391 4690 generic.go:334] "Generic (PLEG): container finished" podID="189715be-f690-4a1d-9bd3-fb0dcae7affe" containerID="6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4" exitCode=1 Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.709470 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bf8dm" event={"ID":"189715be-f690-4a1d-9bd3-fb0dcae7affe","Type":"ContainerDied","Data":"6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4"} Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.710124 4690 scope.go:117] "RemoveContainer" containerID="6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.727466 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.745362 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.769804 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.785107 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.804229 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.819187 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.831899 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32
:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.845481 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.858338 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.870312 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.881714 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.895024 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.911734 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.928486 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:41Z\\\",\\\"message\\\":\\\"2026-03-20T17:33:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5\\\\n2026-03-20T17:33:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5 to /host/opt/cni/bin/\\\\n2026-03-20T17:33:56Z [verbose] multus-daemon started\\\\n2026-03-20T17:33:56Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:34:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.942185 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.964237 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.978618 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:42 crc kubenswrapper[4690]: I0320 17:34:42.998490 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:42Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.025349 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.716766 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/0.log" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.717419 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bf8dm" event={"ID":"189715be-f690-4a1d-9bd3-fb0dcae7affe","Type":"ContainerStarted","Data":"1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7"} Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.739086 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.761167 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.775945 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:41Z\\\",\\\"message\\\":\\\"2026-03-20T17:33:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5\\\\n2026-03-20T17:33:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5 to /host/opt/cni/bin/\\\\n2026-03-20T17:33:56Z [verbose] multus-daemon started\\\\n2026-03-20T17:33:56Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:34:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.792943 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.823676 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.834767 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.850049 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.875841 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.883005 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.883056 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:43 crc kubenswrapper[4690]: E0320 17:34:43.883101 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:43 crc kubenswrapper[4690]: E0320 17:34:43.883174 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.883296 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:43 crc kubenswrapper[4690]: E0320 17:34:43.883403 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.883478 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:43 crc kubenswrapper[4690]: E0320 17:34:43.883672 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.884832 4690 scope.go:117] "RemoveContainer" containerID="31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.891401 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.907531 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 
17:34:43.932520 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.952831 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.970523 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:43 crc kubenswrapper[4690]: I0320 17:34:43.988926 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:43Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.006319 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32
:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.025036 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.042815 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.059516 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.077790 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.722793 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/2.log" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.725536 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.726268 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.745193 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.767105 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.786967 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:41Z\\\",\\\"message\\\":\\\"2026-03-20T17:33:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5\\\\n2026-03-20T17:33:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5 to /host/opt/cni/bin/\\\\n2026-03-20T17:33:56Z [verbose] multus-daemon started\\\\n2026-03-20T17:33:56Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:34:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.803911 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.834103 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.847716 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.864823 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.895359 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.906824 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.923190 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.937666 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.953912 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.973047 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:44 crc kubenswrapper[4690]: I0320 17:34:44.987578 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:44Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.003917 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.017662 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.031776 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.047228 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.061995 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.731861 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/3.log" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.733036 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/2.log" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.737284 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" exitCode=1 Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.737345 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.737418 4690 scope.go:117] "RemoveContainer" containerID="31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.738399 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:34:45 crc kubenswrapper[4690]: E0320 17:34:45.738868 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.776742 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f79
34bb0c68d3f3c30316b491ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:44Z\\\",\\\"message\\\":\\\"cyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:34:44.775483 7209 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:34:44.775498 7209 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:34:44.775533 7209 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:34:44.775568 7209 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:34:44.775579 7209 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:34:44.775641 7209 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:34:44.775695 7209 factory.go:656] Stopping watch factory\\\\nI0320 17:34:44.775730 7209 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:34:44.775733 7209 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:34:44.775750 7209 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0320 17:34:44.775766 7209 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:34:44.775781 7209 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerI
D\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.802685 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b
82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.829407 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.842276 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.859233 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.872110 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.883246 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.883350 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.883368 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.883447 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:45 crc kubenswrapper[4690]: E0320 17:34:45.883443 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:45 crc kubenswrapper[4690]: E0320 17:34:45.883586 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:45 crc kubenswrapper[4690]: E0320 17:34:45.883621 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:45 crc kubenswrapper[4690]: E0320 17:34:45.883804 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.883895 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.895139 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\
\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.910058 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.928611 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.942834 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.957662 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-clus
ter-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.970802 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: I0320 17:34:45.983737 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:45 crc kubenswrapper[4690]: E0320 17:34:45.995732 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.000564 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:45Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.016747 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.033901 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.051030 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.066795 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:41Z\\\",\\\"message\\\":\\\"2026-03-20T17:33:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5\\\\n2026-03-20T17:33:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5 to /host/opt/cni/bin/\\\\n2026-03-20T17:33:56Z [verbose] multus-daemon started\\\\n2026-03-20T17:33:56Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:34:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.079938 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.092217 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.104860 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.119563 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.131043 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.143563 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.154170 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.170113 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:41Z\\\",\\\"message\\\":\\\"2026-03-20T17:33:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5\\\\n2026-03-20T17:33:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5 to /host/opt/cni/bin/\\\\n2026-03-20T17:33:56Z [verbose] multus-daemon started\\\\n2026-03-20T17:33:56Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:34:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.196706 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31889aedfc9e0388694b4201c8c752dabd1634603e55401682b8cb995946bab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:13Z\\\",\\\"message\\\":\\\"k-metrics-daemon-bgj72 in node crc\\\\nI0320 17:34:13.787724 6883 services_controller.go:443] Built service openshift-apiserver/check-endpoints LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.139\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:17698, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:34:13.787931 6883 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0320 17:34:13.788092 6883 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0320 17:34:13.787689 6883 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:44Z\\\",\\\"message\\\":\\\"cyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:34:44.775483 7209 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:34:44.775498 7209 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:34:44.775533 7209 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:34:44.775568 7209 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:34:44.775579 7209 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:34:44.775641 7209 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:34:44.775695 7209 factory.go:656] Stopping watch factory\\\\nI0320 
17:34:44.775730 7209 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:34:44.775733 7209 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:34:44.775750 7209 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:34:44.775766 7209 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:34:44.775781 7209 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.216222 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.235438 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.247443 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.264081 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.281452 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.294736 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.305516 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.321663 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.341578 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.355899 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.744769 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/3.log" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.749317 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:34:46 crc kubenswrapper[4690]: E0320 17:34:46.749500 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.770351 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.787831 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.806519 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.828080 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.843839 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.860377 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.878948 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.894054 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.909390 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.924637 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.942351 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.959571 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:46 crc kubenswrapper[4690]: I0320 17:34:46.980383 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:41Z\\\",\\\"message\\\":\\\"2026-03-20T17:33:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5\\\\n2026-03-20T17:33:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5 to /host/opt/cni/bin/\\\\n2026-03-20T17:33:56Z [verbose] multus-daemon started\\\\n2026-03-20T17:33:56Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:34:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:46.999884 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:46Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.032479 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da218
1b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.049622 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.072974 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.105943 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:44Z\\\",\\\"message\\\":\\\"cyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:34:44.775483 7209 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:34:44.775498 7209 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:34:44.775533 7209 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:34:44.775568 7209 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:34:44.775579 7209 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:34:44.775641 7209 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:34:44.775695 7209 factory.go:656] Stopping watch factory\\\\nI0320 17:34:44.775730 7209 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:34:44.775733 7209 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:34:44.775750 7209 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:34:44.775766 7209 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:34:44.775781 7209 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.128320 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@
sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.882696 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.882725 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.882924 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:47 crc kubenswrapper[4690]: E0320 17:34:47.882912 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:47 crc kubenswrapper[4690]: E0320 17:34:47.883102 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:47 crc kubenswrapper[4690]: I0320 17:34:47.882897 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:47 crc kubenswrapper[4690]: E0320 17:34:47.883318 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:47 crc kubenswrapper[4690]: E0320 17:34:47.883488 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.465602 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.465663 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.465682 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.465704 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.465720 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:49Z","lastTransitionTime":"2026-03-20T17:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.488802 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.494093 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.494144 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.494157 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.494175 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.494187 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:49Z","lastTransitionTime":"2026-03-20T17:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.511327 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.516031 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.516082 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.516094 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.516111 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.516123 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:49Z","lastTransitionTime":"2026-03-20T17:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.533806 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.537868 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.537918 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.537937 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.537965 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.537987 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:49Z","lastTransitionTime":"2026-03-20T17:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.558029 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.562421 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.562483 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.562516 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.562537 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.562549 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:49Z","lastTransitionTime":"2026-03-20T17:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.575099 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"65dcae3a-f6f0-4cdb-ac7a-76b1f475ea12\\\",\\\"systemUUID\\\":\\\"6ccc1e34-4160-4143-b919-ac2f717f294a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.575274 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.882648 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.882815 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.883116 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.883203 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.883456 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.883566 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:49 crc kubenswrapper[4690]: I0320 17:34:49.883674 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:49 crc kubenswrapper[4690]: E0320 17:34:49.883831 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:50 crc kubenswrapper[4690]: E0320 17:34:50.997103 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:34:51 crc kubenswrapper[4690]: I0320 17:34:51.882371 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:51 crc kubenswrapper[4690]: E0320 17:34:51.882580 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:51 crc kubenswrapper[4690]: I0320 17:34:51.882371 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:51 crc kubenswrapper[4690]: I0320 17:34:51.882636 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:51 crc kubenswrapper[4690]: I0320 17:34:51.882644 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:51 crc kubenswrapper[4690]: E0320 17:34:51.882793 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:51 crc kubenswrapper[4690]: E0320 17:34:51.883022 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:51 crc kubenswrapper[4690]: E0320 17:34:51.883170 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:53 crc kubenswrapper[4690]: I0320 17:34:53.882459 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:53 crc kubenswrapper[4690]: I0320 17:34:53.882538 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:53 crc kubenswrapper[4690]: I0320 17:34:53.882477 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:53 crc kubenswrapper[4690]: I0320 17:34:53.882563 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:53 crc kubenswrapper[4690]: E0320 17:34:53.882666 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:53 crc kubenswrapper[4690]: E0320 17:34:53.882799 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:53 crc kubenswrapper[4690]: E0320 17:34:53.882871 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:53 crc kubenswrapper[4690]: E0320 17:34:53.883020 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.883140 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.883229 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.883467 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.883516 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:55 crc kubenswrapper[4690]: E0320 17:34:55.883800 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:55 crc kubenswrapper[4690]: E0320 17:34:55.883882 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:55 crc kubenswrapper[4690]: E0320 17:34:55.884023 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:55 crc kubenswrapper[4690]: E0320 17:34:55.884122 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.899699 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c86b6b30-cf74-4708-b280-8c90ce27af28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9910536149dc102d5a56c9ac27047ab0f86628788126c6c4aaf8aa8e8bc414bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://123e6e9aa8268f78a2852df2460763150ed92462bebd7c852c2bb6f78a092781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://906d1b0f0eda0e576d188ea1c4f601f45dcc8e93bf96330fa4e50be9d7a082b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f674631ef5a418e3657c5c5103ab4c199d3f1690e0a0c737927afd35db4170\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.916478 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://202f57e25ffca6b763271ebd9354cb780bda72898aa4b753ce08bcf5a774dbd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.933059 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18651e4-89e3-43fd-a780-bfa6df87591e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746499ab480c55aa548acd69b4adc2adb724c111d53536273f1e738c5d67209c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v64dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wtg2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.947565 4690 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-4rfg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deaf1de2-4906-4e89-ae1b-83b6d35f97a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53b3e701b77813269b88f29ec4e437ca71cad9cd1b9cc9310dc6b59cc609bcc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qmghf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4rfg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.964068 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f51dea1-fc10-4d4a-9065-2d0c020b36f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc815d328a997ab7b69c5eb959fedde44313867916d64f4ebaf96d77e34b2e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de078ec156833ff0304a8e83014adf2c8fc5c7f8db9bb25c366acf27fa446ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzj2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nqtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:55Z is after 2025-08-24T17:21:41Z" Mar 20 
17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.980653 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a74fb2e29c84d99284cdca82ecd7abae5fc195747f292f11036116ec270ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37728496304293eddfd812f4584815ce277a3a2b02b6716e5f7d5d77ebaf9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:55 crc kubenswrapper[4690]: I0320 17:34:55.996766 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:55Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:55 crc kubenswrapper[4690]: E0320 17:34:55.998943 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.024332 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bf8dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189715be-f690-4a1d-9bd3-fb0dcae7affe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:41Z\\\",\\\"message\\\":\\\"2026-03-20T17:33:56+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5\\\\n2026-03-20T17:33:56+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1bdcc22d-b4ba-4714-aa18-2d803f8b3ba5 to /host/opt/cni/bin/\\\\n2026-03-20T17:33:56Z [verbose] multus-daemon started\\\\n2026-03-20T17:33:56Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:34:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z9vwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bf8dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.039849 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a01b81a-5874-41c3-a2ea-0b3f68fb1194\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696bd60243b29b1c078b32f2dcb7261e108e0b204ba5889b2c0ce5d6c8dff044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98dccfbbb62f60dc126e6c81729f6ac78b1f017d1dd01a200d06beb2296fd1b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:32:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:32:08.107057 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 17:32:08.109144 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:32:08.136585 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:32:08.140728 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 17:32:38.286606 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35960cf982659d799c1e2ce1a4c7eb21b7b1c5d8e5979668b4b6df505c38bdf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079fc6ab0278dfdaa56142eb90b06568010882948e45bea053b0459a68c9faa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.071415 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dacf40f3-f7fe-429b-bb11-3057bc037779\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b273b610fa19944625ca87d5ec10f818b86154d676f1def5ebe494ee44ed3848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f552ca9ec154d035a9f9809b20d9ff2cd19bbd4cb9262173a0334289741f4fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c552d958aced0cb683d87c3ef8d88494d4888ccb028a9f4c27b24b4923264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5355eb1563fa92e70ca61e39a864a15b53da2181b277f3e134d121b5626b954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f044bae4d4345b16e951ba16d4dc6df9b400789b67b6eb23d806fba27dc77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://601a5cb96354f970de2322d08594baacac3c21ec962d27dc0c809f1bc99de4d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e719a69188fb4ee3882973f6f72ba027c5a546cb39b119b27bcd38d8cc728521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef118e8eca52e42d265877595d296d5641caa5c79886b886eefca7686f9b6524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.086026 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bgj72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djqjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bgj72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.102516 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fe7c1d1-7aa9-4c64-941e-7415a99367ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bad176e93c3fff461f57c5c15ed0d5bcc9ef12767d38012fe1145dd701112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56dc92b978a7c1bbd4e3ccc2a6821348e2a990247e49e82c4de43c8bbe305cad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b4a3f2829967bcafe60ed0c6d08a421e8c8a5cd49d2a7445bbc92c2592d7457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://971c38dc48c64a0c8c8781e6d2a3d6f5222f9e846fb32ae417a4a1872a296b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6e590fdf915cb209ad79022e0bb1b20cf642ebfeaa5e67cad61f14c495feaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa9bfe0b6b30c8ecbcab836f9fd1770f959392e981e9676b281b5768a4279d22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d589b2d09af16c9faaa995e5d4abaa7663d53b499e93fbb2ad76e9ef14ff32c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79kbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tzvwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.128907 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01a728ab-e286-4606-b922-d510978b863a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:34:44Z\\\",\\\"message\\\":\\\"cyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:34:44.775483 7209 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:34:44.775498 7209 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:34:44.775533 7209 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:34:44.775568 7209 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:34:44.775579 7209 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:34:44.775641 7209 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:34:44.775695 7209 factory.go:656] Stopping watch factory\\\\nI0320 17:34:44.775730 7209 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:34:44.775733 7209 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:34:44.775750 7209 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:34:44.775766 7209 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:34:44.775781 7209 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:34:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:33:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmwk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7bsmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.142955 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cdd6a8b-6b15-41c5-ba81-51e1ef53835e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c42561cbc470c23295468bf31d6dda364c3962cf8ac84f53ed62c01fa3e19db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c
2f500a3ed589fb4bc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbfcf8baf8b3cc4746bc7b314297f0f820b7461ad85d9c2f500a3ed589fb4bc0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.161927 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ec4f2e-81b3-4b81-b071-1306b93f352a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:32:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4
f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:33:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:33:16.417534 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:33:16.417775 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:33:16.418850 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4179466923/tls.crt::/tmp/serving-cert-4179466923/tls.key\\\\\\\"\\\\nI0320 17:33:16.771141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:33:16.777371 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:33:16.777420 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:33:16.777489 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:33:16.777503 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:33:16.783760 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 17:33:16.783788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:33:16.783793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 17:33:16.783790 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nW0320 17:33:16.783798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:33:16.783816 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:33:16.783823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:33:16.783828 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:33:16.787038 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:33:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:32:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:32:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:32:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.182443 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7501f273f832d465f837fe21cbfaddda7e9fdbfafe44e94d3fbfee21bbd2735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.197917 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.216106 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:56 crc kubenswrapper[4690]: I0320 17:34:56.228325 4690 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhmg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5abdfe2-a5f7-43a7-9c83-a9eb0dacdea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:33:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19bc44db59dd7f723e92f099fb77ea80fac41a5fc0a3818ddd8d443495c50c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:33:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lb8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:33:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhmg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:34:56Z is after 2025-08-24T17:21:41Z" Mar 20 17:34:57 crc kubenswrapper[4690]: I0320 17:34:57.883212 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:57 crc kubenswrapper[4690]: I0320 17:34:57.883341 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:57 crc kubenswrapper[4690]: I0320 17:34:57.883431 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:57 crc kubenswrapper[4690]: E0320 17:34:57.883433 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:57 crc kubenswrapper[4690]: I0320 17:34:57.883512 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:57 crc kubenswrapper[4690]: E0320 17:34:57.883650 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:57 crc kubenswrapper[4690]: E0320 17:34:57.883795 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:57 crc kubenswrapper[4690]: E0320 17:34:57.884079 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.662725 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.662794 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.662822 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.662854 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.662876 4690 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:34:59Z","lastTransitionTime":"2026-03-20T17:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.721659 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f"] Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.722242 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.725174 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.725222 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.725172 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.725415 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.799954 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e367144e-990c-48cd-93ad-348a3d3f5812-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.800006 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e367144e-990c-48cd-93ad-348a3d3f5812-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.800038 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e367144e-990c-48cd-93ad-348a3d3f5812-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.800097 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e367144e-990c-48cd-93ad-348a3d3f5812-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.800149 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e367144e-990c-48cd-93ad-348a3d3f5812-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.806989 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bf8dm" podStartSLOduration=131.806969569 podStartE2EDuration="2m11.806969569s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:59.806505266 +0000 UTC m=+174.672330994" watchObservedRunningTime="2026-03-20 17:34:59.806969569 +0000 UTC m=+174.672795267" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.825282 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=39.825242986 podStartE2EDuration="39.825242986s" podCreationTimestamp="2026-03-20 17:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:59.825178034 +0000 UTC m=+174.691003722" watchObservedRunningTime="2026-03-20 17:34:59.825242986 +0000 UTC m=+174.691068664" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.882517 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.882547 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.882592 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:34:59 crc kubenswrapper[4690]: E0320 17:34:59.882660 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:34:59 crc kubenswrapper[4690]: E0320 17:34:59.882824 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:34:59 crc kubenswrapper[4690]: E0320 17:34:59.882866 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.882930 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:34:59 crc kubenswrapper[4690]: E0320 17:34:59.882997 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.893631 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=86.893612207 podStartE2EDuration="1m26.893612207s" podCreationTimestamp="2026-03-20 17:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:59.892206776 +0000 UTC m=+174.758032474" watchObservedRunningTime="2026-03-20 17:34:59.893612207 +0000 UTC m=+174.759437885" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.905848 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e367144e-990c-48cd-93ad-348a3d3f5812-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.905945 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e367144e-990c-48cd-93ad-348a3d3f5812-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.905989 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e367144e-990c-48cd-93ad-348a3d3f5812-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.906019 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e367144e-990c-48cd-93ad-348a3d3f5812-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.906047 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e367144e-990c-48cd-93ad-348a3d3f5812-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.906474 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e367144e-990c-48cd-93ad-348a3d3f5812-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.907278 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e367144e-990c-48cd-93ad-348a3d3f5812-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.907346 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e367144e-990c-48cd-93ad-348a3d3f5812-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.913305 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e367144e-990c-48cd-93ad-348a3d3f5812-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.930660 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e367144e-990c-48cd-93ad-348a3d3f5812-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9zx8f\" (UID: \"e367144e-990c-48cd-93ad-348a3d3f5812\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.946995 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tzvwm" podStartSLOduration=131.946972975 podStartE2EDuration="2m11.946972975s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:59.925388643 +0000 UTC m=+174.791214341" 
watchObservedRunningTime="2026-03-20 17:34:59.946972975 +0000 UTC m=+174.812798663" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.958619 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=88.95858893 podStartE2EDuration="1m28.95858893s" podCreationTimestamp="2026-03-20 17:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:59.957584771 +0000 UTC m=+174.823410459" watchObservedRunningTime="2026-03-20 17:34:59.95858893 +0000 UTC m=+174.824414628" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.974945 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=94.974923321 podStartE2EDuration="1m34.974923321s" podCreationTimestamp="2026-03-20 17:33:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:59.974918391 +0000 UTC m=+174.840744079" watchObservedRunningTime="2026-03-20 17:34:59.974923321 +0000 UTC m=+174.840748999" Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.987353 4690 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 17:34:59 crc kubenswrapper[4690]: I0320 17:34:59.993549 4690 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.023732 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qhmg6" podStartSLOduration=132.023716786 podStartE2EDuration="2m12.023716786s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:00.023411917 +0000 UTC m=+174.889237615" watchObservedRunningTime="2026-03-20 17:35:00.023716786 +0000 UTC m=+174.889542464" Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.037476 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=58.037452862 podStartE2EDuration="58.037452862s" podCreationTimestamp="2026-03-20 17:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:00.037322438 +0000 UTC m=+174.903148116" watchObservedRunningTime="2026-03-20 17:35:00.037452862 +0000 UTC m=+174.903278550" Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.058748 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podStartSLOduration=132.058730766 podStartE2EDuration="2m12.058730766s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:00.058382496 +0000 UTC m=+174.924208164" watchObservedRunningTime="2026-03-20 17:35:00.058730766 +0000 UTC m=+174.924556454" Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.071174 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4rfg5" 
podStartSLOduration=132.071154344 podStartE2EDuration="2m12.071154344s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:00.070925967 +0000 UTC m=+174.936751665" watchObservedRunningTime="2026-03-20 17:35:00.071154344 +0000 UTC m=+174.936980022" Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.078675 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" Mar 20 17:35:00 crc kubenswrapper[4690]: W0320 17:35:00.094579 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode367144e_990c_48cd_93ad_348a3d3f5812.slice/crio-bdb85b802eef4989d6b65013845c201b535e8e2de36f6d8bfab422785afd91fb WatchSource:0}: Error finding container bdb85b802eef4989d6b65013845c201b535e8e2de36f6d8bfab422785afd91fb: Status 404 returned error can't find the container with id bdb85b802eef4989d6b65013845c201b535e8e2de36f6d8bfab422785afd91fb Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.096203 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nqtt" podStartSLOduration=132.096186875 podStartE2EDuration="2m12.096186875s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:00.08523792 +0000 UTC m=+174.951063598" watchObservedRunningTime="2026-03-20 17:35:00.096186875 +0000 UTC m=+174.962012553" Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.804459 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" event={"ID":"e367144e-990c-48cd-93ad-348a3d3f5812","Type":"ContainerStarted","Data":"2e769d8b3b8c6e55a3095fd4201826d9e525525c7c38b7870006751715e27959"} Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.804551 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" event={"ID":"e367144e-990c-48cd-93ad-348a3d3f5812","Type":"ContainerStarted","Data":"bdb85b802eef4989d6b65013845c201b535e8e2de36f6d8bfab422785afd91fb"} Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.826449 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9zx8f" podStartSLOduration=132.826420406 podStartE2EDuration="2m12.826420406s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:00.825364135 +0000 UTC m=+175.691189853" watchObservedRunningTime="2026-03-20 17:35:00.826420406 +0000 UTC m=+175.692246194" Mar 20 17:35:00 crc kubenswrapper[4690]: I0320 17:35:00.884568 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:35:00 crc kubenswrapper[4690]: E0320 17:35:00.884837 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:35:01 crc kubenswrapper[4690]: E0320 17:35:01.000929 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:35:01 crc kubenswrapper[4690]: I0320 17:35:01.882669 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:01 crc kubenswrapper[4690]: I0320 17:35:01.882727 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:01 crc kubenswrapper[4690]: I0320 17:35:01.882677 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:01 crc kubenswrapper[4690]: I0320 17:35:01.882833 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:01 crc kubenswrapper[4690]: E0320 17:35:01.883397 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:01 crc kubenswrapper[4690]: E0320 17:35:01.883580 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:01 crc kubenswrapper[4690]: E0320 17:35:01.883758 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:01 crc kubenswrapper[4690]: E0320 17:35:01.883952 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:03 crc kubenswrapper[4690]: I0320 17:35:03.883158 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:03 crc kubenswrapper[4690]: I0320 17:35:03.883162 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:03 crc kubenswrapper[4690]: I0320 17:35:03.883309 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:03 crc kubenswrapper[4690]: E0320 17:35:03.883405 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:03 crc kubenswrapper[4690]: I0320 17:35:03.883442 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:03 crc kubenswrapper[4690]: E0320 17:35:03.883751 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:03 crc kubenswrapper[4690]: E0320 17:35:03.883918 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:03 crc kubenswrapper[4690]: E0320 17:35:03.883958 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:05 crc kubenswrapper[4690]: I0320 17:35:05.882357 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:05 crc kubenswrapper[4690]: I0320 17:35:05.882394 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:05 crc kubenswrapper[4690]: E0320 17:35:05.884667 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:05 crc kubenswrapper[4690]: I0320 17:35:05.884713 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:05 crc kubenswrapper[4690]: I0320 17:35:05.884727 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:05 crc kubenswrapper[4690]: E0320 17:35:05.885218 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:05 crc kubenswrapper[4690]: E0320 17:35:05.885332 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:05 crc kubenswrapper[4690]: E0320 17:35:05.884840 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:06 crc kubenswrapper[4690]: E0320 17:35:06.001785 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:35:07 crc kubenswrapper[4690]: I0320 17:35:07.882725 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:07 crc kubenswrapper[4690]: I0320 17:35:07.882792 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:07 crc kubenswrapper[4690]: I0320 17:35:07.882904 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:07 crc kubenswrapper[4690]: E0320 17:35:07.883065 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:07 crc kubenswrapper[4690]: I0320 17:35:07.883163 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:07 crc kubenswrapper[4690]: E0320 17:35:07.883305 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:07 crc kubenswrapper[4690]: E0320 17:35:07.883527 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:07 crc kubenswrapper[4690]: E0320 17:35:07.884033 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:09 crc kubenswrapper[4690]: I0320 17:35:09.882443 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:09 crc kubenswrapper[4690]: E0320 17:35:09.883188 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:09 crc kubenswrapper[4690]: I0320 17:35:09.882602 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:09 crc kubenswrapper[4690]: E0320 17:35:09.883485 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:09 crc kubenswrapper[4690]: I0320 17:35:09.882509 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:09 crc kubenswrapper[4690]: E0320 17:35:09.883717 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:09 crc kubenswrapper[4690]: I0320 17:35:09.882653 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:09 crc kubenswrapper[4690]: E0320 17:35:09.883962 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:11 crc kubenswrapper[4690]: E0320 17:35:11.003326 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:35:11 crc kubenswrapper[4690]: I0320 17:35:11.882138 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:11 crc kubenswrapper[4690]: I0320 17:35:11.882194 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:11 crc kubenswrapper[4690]: I0320 17:35:11.882235 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:11 crc kubenswrapper[4690]: I0320 17:35:11.882138 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:11 crc kubenswrapper[4690]: E0320 17:35:11.882311 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:11 crc kubenswrapper[4690]: E0320 17:35:11.882475 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:11 crc kubenswrapper[4690]: E0320 17:35:11.882499 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:11 crc kubenswrapper[4690]: E0320 17:35:11.882638 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:13 crc kubenswrapper[4690]: I0320 17:35:13.882600 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:13 crc kubenswrapper[4690]: E0320 17:35:13.882814 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:13 crc kubenswrapper[4690]: I0320 17:35:13.882625 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:13 crc kubenswrapper[4690]: E0320 17:35:13.883293 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:13 crc kubenswrapper[4690]: I0320 17:35:13.883707 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:13 crc kubenswrapper[4690]: I0320 17:35:13.883708 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:13 crc kubenswrapper[4690]: E0320 17:35:13.884055 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:13 crc kubenswrapper[4690]: E0320 17:35:13.884163 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:14 crc kubenswrapper[4690]: I0320 17:35:14.884203 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:35:14 crc kubenswrapper[4690]: E0320 17:35:14.884560 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7bsmm_openshift-ovn-kubernetes(01a728ab-e286-4606-b922-d510978b863a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" Mar 20 17:35:15 crc kubenswrapper[4690]: I0320 17:35:15.882399 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:15 crc kubenswrapper[4690]: E0320 17:35:15.882529 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:15 crc kubenswrapper[4690]: I0320 17:35:15.882604 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:15 crc kubenswrapper[4690]: I0320 17:35:15.882676 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:15 crc kubenswrapper[4690]: E0320 17:35:15.885097 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:15 crc kubenswrapper[4690]: E0320 17:35:15.885459 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:15 crc kubenswrapper[4690]: I0320 17:35:15.885143 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:15 crc kubenswrapper[4690]: E0320 17:35:15.886486 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:16 crc kubenswrapper[4690]: E0320 17:35:16.004123 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:35:17 crc kubenswrapper[4690]: I0320 17:35:17.882886 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:17 crc kubenswrapper[4690]: E0320 17:35:17.883062 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:17 crc kubenswrapper[4690]: I0320 17:35:17.883228 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:17 crc kubenswrapper[4690]: I0320 17:35:17.883311 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:17 crc kubenswrapper[4690]: I0320 17:35:17.882914 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:17 crc kubenswrapper[4690]: E0320 17:35:17.883606 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:17 crc kubenswrapper[4690]: E0320 17:35:17.883875 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:17 crc kubenswrapper[4690]: E0320 17:35:17.884057 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:19 crc kubenswrapper[4690]: I0320 17:35:19.882707 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:19 crc kubenswrapper[4690]: I0320 17:35:19.882786 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:19 crc kubenswrapper[4690]: E0320 17:35:19.882829 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:19 crc kubenswrapper[4690]: I0320 17:35:19.882995 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:19 crc kubenswrapper[4690]: E0320 17:35:19.883056 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:19 crc kubenswrapper[4690]: I0320 17:35:19.883004 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:19 crc kubenswrapper[4690]: E0320 17:35:19.883177 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:19 crc kubenswrapper[4690]: E0320 17:35:19.883661 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:21 crc kubenswrapper[4690]: E0320 17:35:21.004975 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:35:21 crc kubenswrapper[4690]: I0320 17:35:21.882753 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:21 crc kubenswrapper[4690]: I0320 17:35:21.882835 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:21 crc kubenswrapper[4690]: E0320 17:35:21.882969 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:21 crc kubenswrapper[4690]: I0320 17:35:21.882754 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:21 crc kubenswrapper[4690]: I0320 17:35:21.883054 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:21 crc kubenswrapper[4690]: E0320 17:35:21.883213 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:21 crc kubenswrapper[4690]: E0320 17:35:21.883371 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:21 crc kubenswrapper[4690]: E0320 17:35:21.883492 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:23 crc kubenswrapper[4690]: I0320 17:35:23.882313 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:23 crc kubenswrapper[4690]: I0320 17:35:23.882457 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:23 crc kubenswrapper[4690]: I0320 17:35:23.882477 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:23 crc kubenswrapper[4690]: E0320 17:35:23.882646 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:23 crc kubenswrapper[4690]: I0320 17:35:23.882940 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:23 crc kubenswrapper[4690]: E0320 17:35:23.883041 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:23 crc kubenswrapper[4690]: E0320 17:35:23.883248 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:23 crc kubenswrapper[4690]: E0320 17:35:23.883492 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:25 crc kubenswrapper[4690]: I0320 17:35:25.883323 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:25 crc kubenswrapper[4690]: I0320 17:35:25.883322 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:25 crc kubenswrapper[4690]: I0320 17:35:25.883385 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:25 crc kubenswrapper[4690]: I0320 17:35:25.883240 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:25 crc kubenswrapper[4690]: E0320 17:35:25.884708 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:25 crc kubenswrapper[4690]: E0320 17:35:25.885325 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:25 crc kubenswrapper[4690]: E0320 17:35:25.885510 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:25 crc kubenswrapper[4690]: E0320 17:35:25.885559 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:26 crc kubenswrapper[4690]: E0320 17:35:26.006003 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:35:27 crc kubenswrapper[4690]: I0320 17:35:27.882315 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:27 crc kubenswrapper[4690]: I0320 17:35:27.882358 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:27 crc kubenswrapper[4690]: I0320 17:35:27.882347 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:27 crc kubenswrapper[4690]: I0320 17:35:27.882333 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:27 crc kubenswrapper[4690]: E0320 17:35:27.882539 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:27 crc kubenswrapper[4690]: E0320 17:35:27.883188 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:27 crc kubenswrapper[4690]: E0320 17:35:27.883597 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:27 crc kubenswrapper[4690]: E0320 17:35:27.884217 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:28 crc kubenswrapper[4690]: I0320 17:35:28.900547 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/1.log" Mar 20 17:35:28 crc kubenswrapper[4690]: I0320 17:35:28.901222 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/0.log" Mar 20 17:35:28 crc kubenswrapper[4690]: I0320 17:35:28.901301 4690 generic.go:334] "Generic (PLEG): container finished" podID="189715be-f690-4a1d-9bd3-fb0dcae7affe" containerID="1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7" exitCode=1 Mar 20 17:35:28 crc kubenswrapper[4690]: I0320 17:35:28.901338 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bf8dm" event={"ID":"189715be-f690-4a1d-9bd3-fb0dcae7affe","Type":"ContainerDied","Data":"1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7"} Mar 20 17:35:28 crc kubenswrapper[4690]: I0320 17:35:28.901374 4690 scope.go:117] "RemoveContainer" containerID="6ab5d0027832ffcb62f2f0869a4811a56bd02954cbdd4fd0e20870dc72818ba4" Mar 20 17:35:28 crc kubenswrapper[4690]: I0320 17:35:28.901861 4690 scope.go:117] "RemoveContainer" containerID="1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7" Mar 20 17:35:28 crc kubenswrapper[4690]: E0320 17:35:28.902114 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bf8dm_openshift-multus(189715be-f690-4a1d-9bd3-fb0dcae7affe)\"" pod="openshift-multus/multus-bf8dm" podUID="189715be-f690-4a1d-9bd3-fb0dcae7affe" Mar 20 17:35:29 crc kubenswrapper[4690]: I0320 17:35:29.882663 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:29 crc kubenswrapper[4690]: I0320 17:35:29.882670 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:29 crc kubenswrapper[4690]: I0320 17:35:29.882774 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:29 crc kubenswrapper[4690]: I0320 17:35:29.882976 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:29 crc kubenswrapper[4690]: E0320 17:35:29.883328 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:29 crc kubenswrapper[4690]: E0320 17:35:29.883639 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:29 crc kubenswrapper[4690]: I0320 17:35:29.883679 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:35:29 crc kubenswrapper[4690]: E0320 17:35:29.883967 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:29 crc kubenswrapper[4690]: E0320 17:35:29.884068 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:29 crc kubenswrapper[4690]: I0320 17:35:29.907337 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/1.log" Mar 20 17:35:30 crc kubenswrapper[4690]: I0320 17:35:30.831709 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bgj72"] Mar 20 17:35:30 crc kubenswrapper[4690]: I0320 17:35:30.831884 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:30 crc kubenswrapper[4690]: E0320 17:35:30.832101 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:30 crc kubenswrapper[4690]: I0320 17:35:30.912487 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/3.log" Mar 20 17:35:30 crc kubenswrapper[4690]: I0320 17:35:30.915520 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerStarted","Data":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} Mar 20 17:35:30 crc kubenswrapper[4690]: I0320 17:35:30.916052 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:35:30 crc kubenswrapper[4690]: I0320 17:35:30.955003 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podStartSLOduration=162.954980287 podStartE2EDuration="2m42.954980287s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:30.95372949 +0000 UTC m=+205.819555208" watchObservedRunningTime="2026-03-20 17:35:30.954980287 +0000 UTC m=+205.820805985" Mar 20 17:35:31 crc kubenswrapper[4690]: E0320 17:35:31.007818 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:35:31 crc kubenswrapper[4690]: I0320 17:35:31.946488 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:31 crc kubenswrapper[4690]: I0320 17:35:31.946548 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:31 crc kubenswrapper[4690]: I0320 17:35:31.946681 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:31 crc kubenswrapper[4690]: E0320 17:35:31.946698 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:31 crc kubenswrapper[4690]: E0320 17:35:31.946962 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:31 crc kubenswrapper[4690]: E0320 17:35:31.947164 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:32 crc kubenswrapper[4690]: I0320 17:35:32.250391 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.250571 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:37:34.25053532 +0000 UTC m=+329.116361048 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:32 crc kubenswrapper[4690]: I0320 17:35:32.250637 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:32 crc kubenswrapper[4690]: I0320 17:35:32.250727 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:32 crc kubenswrapper[4690]: I0320 17:35:32.250790 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:32 crc kubenswrapper[4690]: I0320 17:35:32.250846 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 
17:35:32.250889 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: I0320 17:35:32.250910 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.250972 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:37:34.250948882 +0000 UTC m=+329.116774600 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251037 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251042 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251090 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251136 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:37:34.251105147 +0000 UTC m=+329.116930865 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251103 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251188 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251143 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251241 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251244 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:37:34.25122457 +0000 UTC m=+329.117050288 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251442 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:37:34.251419006 +0000 UTC m=+329.117244734 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251062 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.251539 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:37:34.251514309 +0000 UTC m=+329.117340027 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:35:32 crc kubenswrapper[4690]: I0320 17:35:32.883206 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:32 crc kubenswrapper[4690]: E0320 17:35:32.883451 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:33 crc kubenswrapper[4690]: I0320 17:35:33.882296 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:33 crc kubenswrapper[4690]: I0320 17:35:33.882380 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:33 crc kubenswrapper[4690]: E0320 17:35:33.882519 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:33 crc kubenswrapper[4690]: I0320 17:35:33.882535 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:33 crc kubenswrapper[4690]: E0320 17:35:33.882632 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:33 crc kubenswrapper[4690]: E0320 17:35:33.882729 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:34 crc kubenswrapper[4690]: I0320 17:35:34.883153 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:34 crc kubenswrapper[4690]: E0320 17:35:34.883724 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:35 crc kubenswrapper[4690]: I0320 17:35:35.882635 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:35 crc kubenswrapper[4690]: I0320 17:35:35.882679 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:35 crc kubenswrapper[4690]: I0320 17:35:35.882638 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:35 crc kubenswrapper[4690]: E0320 17:35:35.883633 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:35 crc kubenswrapper[4690]: E0320 17:35:35.883793 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:35 crc kubenswrapper[4690]: E0320 17:35:35.883822 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:36 crc kubenswrapper[4690]: E0320 17:35:36.008800 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:35:36 crc kubenswrapper[4690]: I0320 17:35:36.882626 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:36 crc kubenswrapper[4690]: E0320 17:35:36.882853 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:37 crc kubenswrapper[4690]: I0320 17:35:37.883730 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:37 crc kubenswrapper[4690]: E0320 17:35:37.883855 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:37 crc kubenswrapper[4690]: I0320 17:35:37.883923 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:37 crc kubenswrapper[4690]: I0320 17:35:37.883737 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:37 crc kubenswrapper[4690]: E0320 17:35:37.884136 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:37 crc kubenswrapper[4690]: E0320 17:35:37.884493 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:38 crc kubenswrapper[4690]: I0320 17:35:38.882846 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:38 crc kubenswrapper[4690]: E0320 17:35:38.883031 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:39 crc kubenswrapper[4690]: I0320 17:35:39.882877 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:39 crc kubenswrapper[4690]: E0320 17:35:39.883552 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:39 crc kubenswrapper[4690]: I0320 17:35:39.883230 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:39 crc kubenswrapper[4690]: E0320 17:35:39.883626 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:39 crc kubenswrapper[4690]: I0320 17:35:39.883129 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:39 crc kubenswrapper[4690]: E0320 17:35:39.883692 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:40 crc kubenswrapper[4690]: I0320 17:35:40.882600 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:40 crc kubenswrapper[4690]: E0320 17:35:40.882803 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:41 crc kubenswrapper[4690]: E0320 17:35:41.010543 4690 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 20 17:35:41 crc kubenswrapper[4690]: I0320 17:35:41.882811 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:41 crc kubenswrapper[4690]: I0320 17:35:41.882880 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:41 crc kubenswrapper[4690]: I0320 17:35:41.882919 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:41 crc kubenswrapper[4690]: E0320 17:35:41.883048 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:41 crc kubenswrapper[4690]: E0320 17:35:41.883170 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:41 crc kubenswrapper[4690]: E0320 17:35:41.883291 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:42 crc kubenswrapper[4690]: I0320 17:35:42.882526 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:42 crc kubenswrapper[4690]: E0320 17:35:42.882645 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:42 crc kubenswrapper[4690]: I0320 17:35:42.882828 4690 scope.go:117] "RemoveContainer" containerID="1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7" Mar 20 17:35:43 crc kubenswrapper[4690]: I0320 17:35:43.882848 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:43 crc kubenswrapper[4690]: I0320 17:35:43.882886 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:43 crc kubenswrapper[4690]: E0320 17:35:43.883463 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:43 crc kubenswrapper[4690]: I0320 17:35:43.883190 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:43 crc kubenswrapper[4690]: E0320 17:35:43.883541 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:43 crc kubenswrapper[4690]: E0320 17:35:43.883602 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:43 crc kubenswrapper[4690]: I0320 17:35:43.991911 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/1.log" Mar 20 17:35:43 crc kubenswrapper[4690]: I0320 17:35:43.991991 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bf8dm" event={"ID":"189715be-f690-4a1d-9bd3-fb0dcae7affe","Type":"ContainerStarted","Data":"50174b4b1d0d5ad19c52c1f42347f6d15551581b6ce597a9860c9607c408f9ff"} Mar 20 17:35:44 crc kubenswrapper[4690]: I0320 17:35:44.882703 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:44 crc kubenswrapper[4690]: E0320 17:35:44.882854 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:35:45 crc kubenswrapper[4690]: I0320 17:35:45.882165 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:45 crc kubenswrapper[4690]: I0320 17:35:45.882181 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:45 crc kubenswrapper[4690]: I0320 17:35:45.882206 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:45 crc kubenswrapper[4690]: E0320 17:35:45.886334 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:35:45 crc kubenswrapper[4690]: E0320 17:35:45.886653 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:35:45 crc kubenswrapper[4690]: E0320 17:35:45.886530 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:35:46 crc kubenswrapper[4690]: I0320 17:35:46.882210 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:35:46 crc kubenswrapper[4690]: I0320 17:35:46.884133 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 17:35:46 crc kubenswrapper[4690]: I0320 17:35:46.884587 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 17:35:47 crc kubenswrapper[4690]: I0320 17:35:47.882995 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:35:47 crc kubenswrapper[4690]: I0320 17:35:47.883001 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:35:47 crc kubenswrapper[4690]: I0320 17:35:47.883279 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:35:47 crc kubenswrapper[4690]: I0320 17:35:47.885485 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 17:35:47 crc kubenswrapper[4690]: I0320 17:35:47.886011 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 17:35:47 crc kubenswrapper[4690]: I0320 17:35:47.886147 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 17:35:47 crc kubenswrapper[4690]: I0320 17:35:47.886353 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.683867 4690 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.743074 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fq57l"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.744728 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.745326 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbkxs"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.746004 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.748389 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-52php"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.749421 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.751327 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.752038 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.752765 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.753547 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.753594 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.754035 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4ee554-cd4d-4ff1-bef8-309484654b00-serving-cert\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.754249 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd4ee554-cd4d-4ff1-bef8-309484654b00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.754362 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg24n\" (UniqueName: \"kubernetes.io/projected/bd4ee554-cd4d-4ff1-bef8-309484654b00-kube-api-access-sg24n\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.754779 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.754923 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.755280 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.756525 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.758853 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.760994 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.761124 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.761418 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.761542 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.763282 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9n98c"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.764100 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.764444 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.764871 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.765783 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.770318 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.770333 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.770348 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.770405 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.770554 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.770626 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.771184 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.771392 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.771612 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.771640 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.771925 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.772086 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.772752 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.780349 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.780603 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.781355 4690 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.781478 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.781562 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.785728 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nm2vw"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.786136 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dxqqz"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.786239 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.786479 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.786638 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.786867 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.787097 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.787318 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.787491 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.787558 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.787569 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.787814 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.788322 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.788337 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.788372 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.788327 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.788486 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.789024 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.793163 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.793497 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.799334 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.799705 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.799918 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800033 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800334 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800491 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800615 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800624 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800718 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800747 4690 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800797 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800860 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800930 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800967 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800992 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.801037 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.801071 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.800930 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.801696 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8l2n9"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.802683 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.802942 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.803496 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.827047 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.840871 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4d"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.841537 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-v9wf6"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.841858 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.841940 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-v9wf6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.842064 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.842375 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.842588 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.842825 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.842899 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.843037 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.843105 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.845319 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fq57l"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.845533 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ppgjz"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.845942 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.846062 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.846375 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.846477 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.847478 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.847626 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.848167 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.848291 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.848557 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.848814 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.849051 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.849243 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.849539 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.849715 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.850102 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d7smm"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.850554 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.850652 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.853289 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fhf7"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.853963 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nprpv"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.854502 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.855086 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.855099 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.855197 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.855285 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.855837 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.857804 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace7a9fa-7eac-449c-8b61-6018d592fc4f-serving-cert\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.857853 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20a97349-3805-4434-be4a-1cb8024add50-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.857880 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrhb\" (UniqueName: \"kubernetes.io/projected/8baee130-f518-4071-afbc-13625917aa7b-kube-api-access-srrhb\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.857906 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-config\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.857934 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-etcd-serving-ca\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.857958 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a97349-3805-4434-be4a-1cb8024add50-config\") pod \"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.857980 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff4fe98d-c7c0-475a-85cb-70ab2c4ad122-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4d\" (UID: \"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858004 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-config\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858024 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg58v\" (UniqueName: \"kubernetes.io/projected/c73bcf80-34dc-466e-b1b0-a92850850498-kube-api-access-cg58v\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858043 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c73bcf80-34dc-466e-b1b0-a92850850498-audit-dir\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858067 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bkg9\" (UniqueName: \"kubernetes.io/projected/41847043-0aca-46d5-940f-3dfd2ded491f-kube-api-access-4bkg9\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858087 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zkhd\" (UniqueName: \"kubernetes.io/projected/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-kube-api-access-5zkhd\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858113 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858135 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858164 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg24n\" (UniqueName: \"kubernetes.io/projected/bd4ee554-cd4d-4ff1-bef8-309484654b00-kube-api-access-sg24n\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858190 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41847043-0aca-46d5-940f-3dfd2ded491f-serving-cert\") pod 
\"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858212 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2657z\" (UniqueName: \"kubernetes.io/projected/fcf13749-fd7c-4f01-9598-7f041910cd74-kube-api-access-2657z\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858271 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-config\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858300 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858325 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-trusted-ca-bundle\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858348 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-config\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858372 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-service-ca-bundle\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858394 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-config\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858419 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6496f\" (UniqueName: \"kubernetes.io/projected/ace7a9fa-7eac-449c-8b61-6018d592fc4f-kube-api-access-6496f\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: 
\"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858443 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-client-ca\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858476 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-service-ca\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858499 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-oauth-config\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858524 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-auth-proxy-config\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858582 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858639 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-config\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858685 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb444275-6cc1-42be-b742-afc344a60995-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858720 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-image-import-ca\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" 
Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858744 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-serving-cert\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858766 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858799 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-client\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858824 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-etcd-client\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858843 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7qc9\" (UniqueName: \"kubernetes.io/projected/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-kube-api-access-d7qc9\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858865 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-client-ca\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858963 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7nx\" (UniqueName: \"kubernetes.io/projected/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-kube-api-access-bf7nx\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.858997 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-ca\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.859052 4690 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4ee554-cd4d-4ff1-bef8-309484654b00-serving-cert\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.859097 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-etcd-client\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.859120 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4b5\" (UniqueName: \"kubernetes.io/projected/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-kube-api-access-zk4b5\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.859152 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.859178 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-audit-policies\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.859200 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-encryption-config\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.859219 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-machine-approver-tls\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.860753 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.862102 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.862320 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 17:35:50 crc 
kubenswrapper[4690]: I0320 17:35:50.862471 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.862548 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.862772 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.862920 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.862998 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863016 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863175 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863222 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863289 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863330 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863372 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863338 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.862486 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863389 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863377 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863505 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863559 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863559 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-serving-cert\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863635 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd4ee554-cd4d-4ff1-bef8-309484654b00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863638 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863692 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863737 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4c77\" (UniqueName: \"kubernetes.io/projected/c4eaf3f2-8536-46bf-8c5f-82606abec128-kube-api-access-w4c77\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863760 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0c61344-19c2-4d8b-8aec-be86ac403866-node-pullsecrets\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863788 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-serving-cert\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863811 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndgf\" (UniqueName: \"kubernetes.io/projected/789eef8f-04a8-44cf-9e16-878de3a035bb-kube-api-access-8ndgf\") pod \"downloads-7954f5f757-v9wf6\" (UID: \"789eef8f-04a8-44cf-9e16-878de3a035bb\") " pod="openshift-console/downloads-7954f5f757-v9wf6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863868 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7g2h\" (UniqueName: \"kubernetes.io/projected/bb444275-6cc1-42be-b742-afc344a60995-kube-api-access-n7g2h\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863897 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8baee130-f518-4071-afbc-13625917aa7b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863917 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf13749-fd7c-4f01-9598-7f041910cd74-serving-cert\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863942 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7tvv\" (UniqueName: \"kubernetes.io/projected/ff4fe98d-c7c0-475a-85cb-70ab2c4ad122-kube-api-access-b7tvv\") pod \"multus-admission-controller-857f4d67dd-6rw4d\" (UID: \"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863976 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb444275-6cc1-42be-b742-afc344a60995-config\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.863997 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-profile-collector-cert\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864041 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd4ee554-cd4d-4ff1-bef8-309484654b00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864047 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrf9\" (UniqueName: \"kubernetes.io/projected/a0c61344-19c2-4d8b-8aec-be86ac403866-kube-api-access-qtrf9\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864109 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-oauth-serving-cert\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864159 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a97349-3805-4434-be4a-1cb8024add50-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864166 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864182 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-serving-cert\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864206 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8baee130-f518-4071-afbc-13625917aa7b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864241 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-config\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864283 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-encryption-config\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864304 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-srv-cert\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864324 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-service-ca\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864347 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bb444275-6cc1-42be-b742-afc344a60995-images\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864365 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864312 4690 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864325 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864368 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0c61344-19c2-4d8b-8aec-be86ac403866-audit-dir\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.864802 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-audit\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.865172 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.865484 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.865643 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.865746 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.866937 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.869108 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4ee554-cd4d-4ff1-bef8-309484654b00-serving-cert\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.879014 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dnpcn"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.879665 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zcl9"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.879968 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.880691 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.885375 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.905654 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.906674 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.907166 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.907275 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.907528 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.907679 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.908545 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.908734 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.928268 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.928781 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-275fn"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.929153 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.929433 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.929740 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2b2sh"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.929764 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.930099 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.931959 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.933545 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.934245 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-sv7wd"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.934664 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.934979 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.935396 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.935745 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.936008 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbkxs"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.937572 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567134-66l98"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.938361 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.938834 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.939099 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-66l98" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.939401 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.940026 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.943570 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5m5vk"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.944278 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.946898 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9n98c"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.948100 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dxqqz"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.948857 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nm2vw"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.949921 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.950071 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-52php"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.951344 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8l2n9"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.953602 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v9wf6"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.956228 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.957311 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.958736 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ppgjz"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.959527 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4d"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.962337 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.962381 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.963324 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.965886 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-88brt"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.967038 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.967168 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.968055 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d7smm"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.969132 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.969604 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.970506 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971774 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20a97349-3805-4434-be4a-1cb8024add50-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971805 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srrhb\" (UniqueName: \"kubernetes.io/projected/8baee130-f518-4071-afbc-13625917aa7b-kube-api-access-srrhb\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971836 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee298daa-0334-4d62-b83f-7c2499f55af6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971855 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-config\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971873 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7917eb5-2c7a-426c-8850-4209cd22e790-serving-cert\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971892 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733046ae-dba4-407a-83ee-89677527d7cc-config\") pod \"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:50 crc 
kubenswrapper[4690]: I0320 17:35:50.971912 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz2mj\" (UniqueName: \"kubernetes.io/projected/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-kube-api-access-vz2mj\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971930 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971947 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-config\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971965 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a97349-3805-4434-be4a-1cb8024add50-config\") pod \"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.971982 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff4fe98d-c7c0-475a-85cb-70ab2c4ad122-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4d\" (UID: \"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972002 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972020 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c73bcf80-34dc-466e-b1b0-a92850850498-audit-dir\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972039 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zkhd\" (UniqueName: \"kubernetes.io/projected/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-kube-api-access-5zkhd\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972058 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-config\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972074 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhc57\" (UniqueName: \"kubernetes.io/projected/733046ae-dba4-407a-83ee-89677527d7cc-kube-api-access-vhc57\") pod \"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972104 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972123 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-trusted-ca-bundle\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972142 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-config\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972163 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-service-ca-bundle\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972182 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-client-ca\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972204 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-dir\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972232 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sntsx\" (UniqueName: \"kubernetes.io/projected/e770c47d-95d6-45be-87cb-1fa3922afa82-kube-api-access-sntsx\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972673 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-config\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972714 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-auth-proxy-config\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972736 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972757 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-image-import-ca\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972778 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-policies\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972795 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-etcd-client\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972813 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7qc9\" (UniqueName: \"kubernetes.io/projected/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-kube-api-access-d7qc9\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972831 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6b5\" (UniqueName: \"kubernetes.io/projected/dc26c755-5e1b-480b-b3ed-b3d3dee36d94-kube-api-access-9v6b5\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-kkhg7\" (UID: \"dc26c755-5e1b-480b-b3ed-b3d3dee36d94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972849 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57922\" (UniqueName: \"kubernetes.io/projected/80d86fac-74cc-41d4-81df-2e718c1568d9-kube-api-access-57922\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972868 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6edcd95a-9780-4af1-9454-da6dce913528-images\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972886 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6edcd95a-9780-4af1-9454-da6dce913528-proxy-tls\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972917 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78540fe-014c-42e6-916c-3f39b4611a15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972933 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972950 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972970 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-etcd-client\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972986 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a97349-3805-4434-be4a-1cb8024add50-config\") pod 
\"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.973134 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-config\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.972996 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mszs7\" (UniqueName: \"kubernetes.io/projected/a151c473-d304-4e1d-ba12-7860c0efbac9-kube-api-access-mszs7\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.974812 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nprpv"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.974824 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-config\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.974849 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-client-ca\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.974870 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c73bcf80-34dc-466e-b1b0-a92850850498-audit-dir\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.975554 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-auth-proxy-config\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.975563 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-service-ca-bundle\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.975578 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-config\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: 
\"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.975598 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-config\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.975661 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fhf7"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.976529 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-image-import-ca\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977000 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977207 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-trusted-ca-bundle\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977358 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4b5\" (UniqueName: \"kubernetes.io/projected/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-kube-api-access-zk4b5\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977411 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-config-volume\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977435 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977470 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-audit-policies\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977489 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-encryption-config\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977758 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-serving-cert\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977810 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-machine-approver-tls\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.977842 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee298daa-0334-4d62-b83f-7c2499f55af6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.978546 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6rktv"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.979055 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.979131 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e770c47d-95d6-45be-87cb-1fa3922afa82-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.979175 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51ad4830-9e57-4bf2-91e5-7c24c7648d8b-metrics-tls\") pod \"dns-operator-744455d44c-nprpv\" (UID: \"51ad4830-9e57-4bf2-91e5-7c24c7648d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.979208 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-metrics-certs\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.979368 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.979412 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xcff\" (UniqueName: \"kubernetes.io/projected/f05b3314-839f-43ca-bb32-951ef0582151-kube-api-access-8xcff\") pod \"ingress-canary-5m5vk\" (UID: \"f05b3314-839f-43ca-bb32-951ef0582151\") " pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980303 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980401 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980454 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4c77\" (UniqueName: \"kubernetes.io/projected/c4eaf3f2-8536-46bf-8c5f-82606abec128-kube-api-access-w4c77\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980508 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-default-certificate\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980669 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-etcd-client\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980689 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-serving-cert\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980736 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb444275-6cc1-42be-b742-afc344a60995-config\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980775 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8baee130-f518-4071-afbc-13625917aa7b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.980822 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981327 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-encryption-config\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981378 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7tvv\" (UniqueName: \"kubernetes.io/projected/ff4fe98d-c7c0-475a-85cb-70ab2c4ad122-kube-api-access-b7tvv\") pod \"multus-admission-controller-857f4d67dd-6rw4d\" (UID: \"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981465 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-profile-collector-cert\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981580 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8baee130-f518-4071-afbc-13625917aa7b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981634 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrf9\" (UniqueName: \"kubernetes.io/projected/a0c61344-19c2-4d8b-8aec-be86ac403866-kube-api-access-qtrf9\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981706 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-oauth-serving-cert\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981718 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb444275-6cc1-42be-b742-afc344a60995-config\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981775 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2wfr\" (UniqueName: \"kubernetes.io/projected/906d9a20-0731-435a-80af-0dab64476e32-kube-api-access-h2wfr\") pod \"router-default-5444994796-sv7wd\" 
(UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981826 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981848 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nfmkn\" (UID: \"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981936 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-serving-cert\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.981998 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8baee130-f518-4071-afbc-13625917aa7b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982026 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gbmq\" (UniqueName: \"kubernetes.io/projected/a78540fe-014c-42e6-916c-3f39b4611a15-kube-api-access-7gbmq\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982091 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-config\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982140 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5m5vk"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982359 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-serving-cert\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982428 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-oauth-serving-cert\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982506 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733046ae-dba4-407a-83ee-89677527d7cc-serving-cert\") pod \"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982555 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mng4t\" (UniqueName: \"kubernetes.io/projected/51ad4830-9e57-4bf2-91e5-7c24c7648d8b-kube-api-access-mng4t\") pod \"dns-operator-744455d44c-nprpv\" (UID: \"51ad4830-9e57-4bf2-91e5-7c24c7648d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982581 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-encryption-config\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982636 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-service-ca\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982662 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982681 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9flwk\" (UniqueName: \"kubernetes.io/projected/34d2f5b9-1f8e-4413-b178-58cd10fa7548-kube-api-access-9flwk\") pod \"auto-csr-approver-29567134-66l98\" (UID: \"34d2f5b9-1f8e-4413-b178-58cd10fa7548\") " pod="openshift-infra/auto-csr-approver-29567134-66l98" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982705 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-audit-policies\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982727 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace7a9fa-7eac-449c-8b61-6018d592fc4f-serving-cert\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982770 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-proxy-tls\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982810 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a151c473-d304-4e1d-ba12-7860c0efbac9-srv-cert\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982833 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-etcd-serving-ca\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982851 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7917eb5-2c7a-426c-8850-4209cd22e790-trusted-ca\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.982888 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc26c755-5e1b-480b-b3ed-b3d3dee36d94-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkhg7\" (UID: \"dc26c755-5e1b-480b-b3ed-b3d3dee36d94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983060 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-serving-cert\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983183 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg58v\" (UniqueName: \"kubernetes.io/projected/c73bcf80-34dc-466e-b1b0-a92850850498-kube-api-access-cg58v\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983284 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9b7t\" (UniqueName: \"kubernetes.io/projected/fc1d890d-f494-466b-94a2-03c2d2c3fe7f-kube-api-access-k9b7t\") pod \"migrator-59844c95c7-8pvtf\" (UID: \"fc1d890d-f494-466b-94a2-03c2d2c3fe7f\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983322 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983362 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983431 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2b2sh"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983593 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bkg9\" (UniqueName: \"kubernetes.io/projected/41847043-0aca-46d5-940f-3dfd2ded491f-kube-api-access-4bkg9\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983633 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a151c473-d304-4e1d-ba12-7860c0efbac9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983657 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2657z\" (UniqueName: \"kubernetes.io/projected/fcf13749-fd7c-4f01-9598-7f041910cd74-kube-api-access-2657z\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983682 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6edcd95a-9780-4af1-9454-da6dce913528-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983795 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41847043-0aca-46d5-940f-3dfd2ded491f-serving-cert\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983819 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7917eb5-2c7a-426c-8850-4209cd22e790-config\") pod \"console-operator-58897d9998-6zcl9\" (UID: 
\"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983846 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78540fe-014c-42e6-916c-3f39b4611a15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983869 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983878 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-config\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983889 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-apiservice-cert\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983955 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-config\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.983978 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984029 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-stats-auth\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984056 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " 
pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984324 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-etcd-serving-ca\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984550 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-config\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984707 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6496f\" (UniqueName: \"kubernetes.io/projected/ace7a9fa-7eac-449c-8b61-6018d592fc4f-kube-api-access-6496f\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984743 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff4fe98d-c7c0-475a-85cb-70ab2c4ad122-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4d\" (UID: \"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984844 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-service-ca\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984875 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-88brt"] Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984890 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.984937 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-oauth-config\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985024 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-config\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985081 
4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e770c47d-95d6-45be-87cb-1fa3922afa82-metrics-tls\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985103 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-signing-key\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985126 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e770c47d-95d6-45be-87cb-1fa3922afa82-trusted-ca\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985144 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-webhook-cert\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985165 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb444275-6cc1-42be-b742-afc344a60995-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985199 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-serving-cert\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985239 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985299 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-client\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985383 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-client-ca\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: 
\"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985389 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-service-ca\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985406 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jd7z\" (UniqueName: \"kubernetes.io/projected/31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8-kube-api-access-4jd7z\") pod \"package-server-manager-789f6589d5-nfmkn\" (UID: \"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985449 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.985524 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.987637 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.987854 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-service-ca\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.988172 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-etcd-client\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.988398 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8baee130-f518-4071-afbc-13625917aa7b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.988458 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtn4s\" (UniqueName: \"kubernetes.io/projected/a7917eb5-2c7a-426c-8850-4209cd22e790-kube-api-access-gtn4s\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.988500 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7nx\" (UniqueName: \"kubernetes.io/projected/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-kube-api-access-bf7nx\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.988527 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-ca\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.988554 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.988844 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-client-ca\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.988865 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.989316 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-secret-volume\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.989545 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.989573 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpkrr\" (UniqueName: 
\"kubernetes.io/projected/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-kube-api-access-rpkrr\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.989786 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.990106 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-profile-collector-cert\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.990112 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-machine-approver-tls\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.990550 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41847043-0aca-46d5-940f-3dfd2ded491f-serving-cert\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.990569 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.990690 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73bcf80-34dc-466e-b1b0-a92850850498-serving-cert\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.990625 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a78540fe-014c-42e6-916c-3f39b4611a15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.990902 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.990984 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace7a9fa-7eac-449c-8b61-6018d592fc4f-serving-cert\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991026 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991066 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991176 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c73bcf80-34dc-466e-b1b0-a92850850498-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991457 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991487 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991499 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xfns\" (UniqueName: \"kubernetes.io/projected/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-kube-api-access-8xfns\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991638 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2vf\" (UniqueName: \"kubernetes.io/projected/1a46a1ee-5f40-4d85-b726-d758b7ceff37-kube-api-access-vr2vf\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 
17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991796 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0c61344-19c2-4d8b-8aec-be86ac403866-node-pullsecrets\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.991829 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndgf\" (UniqueName: \"kubernetes.io/projected/789eef8f-04a8-44cf-9e16-878de3a035bb-kube-api-access-8ndgf\") pod \"downloads-7954f5f757-v9wf6\" (UID: \"789eef8f-04a8-44cf-9e16-878de3a035bb\") " pod="openshift-console/downloads-7954f5f757-v9wf6" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.992373 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7g2h\" (UniqueName: \"kubernetes.io/projected/bb444275-6cc1-42be-b742-afc344a60995-kube-api-access-n7g2h\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.992543 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0c61344-19c2-4d8b-8aec-be86ac403866-node-pullsecrets\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.992959 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-ca\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.992441 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf13749-fd7c-4f01-9598-7f041910cd74-serving-cert\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.993116 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-944vg\" (UniqueName: \"kubernetes.io/projected/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-kube-api-access-944vg\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.993356 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a97349-3805-4434-be4a-1cb8024add50-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.993431 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994277 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-tmpfs\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994482 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/906d9a20-0731-435a-80af-0dab64476e32-service-ca-bundle\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994519 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f05b3314-839f-43ca-bb32-951ef0582151-cert\") pod \"ingress-canary-5m5vk\" (UID: \"f05b3314-839f-43ca-bb32-951ef0582151\") " pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994567 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bb444275-6cc1-42be-b742-afc344a60995-images\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994711 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-srv-cert\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994742 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqp9\" (UniqueName: \"kubernetes.io/projected/ee298daa-0334-4d62-b83f-7c2499f55af6-kube-api-access-9xqp9\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994924 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-audit\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994953 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0c61344-19c2-4d8b-8aec-be86ac403866-audit-dir\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " 
pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.994975 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-signing-cabundle\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.995193 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmz4h\" (UniqueName: \"kubernetes.io/projected/6edcd95a-9780-4af1-9454-da6dce913528-kube-api-access-gmz4h\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.995666 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a0c61344-19c2-4d8b-8aec-be86ac403866-audit\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.996130 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf13749-fd7c-4f01-9598-7f041910cd74-serving-cert\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.996181 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0c61344-19c2-4d8b-8aec-be86ac403866-audit-dir\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.996181 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-oauth-config\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.996775 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a0c61344-19c2-4d8b-8aec-be86ac403866-encryption-config\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.997053 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bb444275-6cc1-42be-b742-afc344a60995-images\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.997115 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-serving-cert\") pod 
\"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.998154 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-srv-cert\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.998918 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.999334 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-config\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:50 crc kubenswrapper[4690]: I0320 17:35:50.999670 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/41847043-0aca-46d5-940f-3dfd2ded491f-etcd-client\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.000737 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.001452 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a97349-3805-4434-be4a-1cb8024add50-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.013824 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb444275-6cc1-42be-b742-afc344a60995-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.013894 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-275fn"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.015660 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.015870 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dnpcn"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.016976 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.017961 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.019203 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.020419 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6rktv"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.021527 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zcl9"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.022765 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-66l98"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.025134 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.026505 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.027160 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-fqlzx"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.027567 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.027824 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.046846 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.067237 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.087783 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.095830 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6b5\" (UniqueName: \"kubernetes.io/projected/dc26c755-5e1b-480b-b3ed-b3d3dee36d94-kube-api-access-9v6b5\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkhg7\" (UID: \"dc26c755-5e1b-480b-b3ed-b3d3dee36d94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.095884 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57922\" (UniqueName: \"kubernetes.io/projected/80d86fac-74cc-41d4-81df-2e718c1568d9-kube-api-access-57922\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.095917 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-policies\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.095955 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6edcd95a-9780-4af1-9454-da6dce913528-images\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.095976 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6edcd95a-9780-4af1-9454-da6dce913528-proxy-tls\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.095995 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096096 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: 
\"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096175 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78540fe-014c-42e6-916c-3f39b4611a15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096229 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mszs7\" (UniqueName: \"kubernetes.io/projected/a151c473-d304-4e1d-ba12-7860c0efbac9-kube-api-access-mszs7\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096280 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-config-volume\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096316 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096488 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee298daa-0334-4d62-b83f-7c2499f55af6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096632 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e770c47d-95d6-45be-87cb-1fa3922afa82-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096663 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-metrics-certs\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.096901 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51ad4830-9e57-4bf2-91e5-7c24c7648d8b-metrics-tls\") pod \"dns-operator-744455d44c-nprpv\" (UID: \"51ad4830-9e57-4bf2-91e5-7c24c7648d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" Mar 20 17:35:51 crc 
kubenswrapper[4690]: I0320 17:35:51.097246 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78540fe-014c-42e6-916c-3f39b4611a15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097468 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xcff\" (UniqueName: \"kubernetes.io/projected/f05b3314-839f-43ca-bb32-951ef0582151-kube-api-access-8xcff\") pod \"ingress-canary-5m5vk\" (UID: \"f05b3314-839f-43ca-bb32-951ef0582151\") " pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097534 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-default-certificate\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097593 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2wfr\" (UniqueName: \"kubernetes.io/projected/906d9a20-0731-435a-80af-0dab64476e32-kube-api-access-h2wfr\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097614 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097632 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gbmq\" (UniqueName: \"kubernetes.io/projected/a78540fe-014c-42e6-916c-3f39b4611a15-kube-api-access-7gbmq\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097652 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nfmkn\" (UID: \"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097681 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733046ae-dba4-407a-83ee-89677527d7cc-serving-cert\") pod \"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097700 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mng4t\" (UniqueName: \"kubernetes.io/projected/51ad4830-9e57-4bf2-91e5-7c24c7648d8b-kube-api-access-mng4t\") pod \"dns-operator-744455d44c-nprpv\" (UID: \"51ad4830-9e57-4bf2-91e5-7c24c7648d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097721 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9flwk\" (UniqueName: \"kubernetes.io/projected/34d2f5b9-1f8e-4413-b178-58cd10fa7548-kube-api-access-9flwk\") pod \"auto-csr-approver-29567134-66l98\" (UID: \"34d2f5b9-1f8e-4413-b178-58cd10fa7548\") " pod="openshift-infra/auto-csr-approver-29567134-66l98" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097742 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097763 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7917eb5-2c7a-426c-8850-4209cd22e790-trusted-ca\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097787 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc26c755-5e1b-480b-b3ed-b3d3dee36d94-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkhg7\" (UID: \"dc26c755-5e1b-480b-b3ed-b3d3dee36d94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097819 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-proxy-tls\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097846 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a151c473-d304-4e1d-ba12-7860c0efbac9-srv-cert\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097880 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9b7t\" (UniqueName: \"kubernetes.io/projected/fc1d890d-f494-466b-94a2-03c2d2c3fe7f-kube-api-access-k9b7t\") pod \"migrator-59844c95c7-8pvtf\" (UID: \"fc1d890d-f494-466b-94a2-03c2d2c3fe7f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097905 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/a151c473-d304-4e1d-ba12-7860c0efbac9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097949 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6edcd95a-9780-4af1-9454-da6dce913528-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.097982 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7917eb5-2c7a-426c-8850-4209cd22e790-config\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098004 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78540fe-014c-42e6-916c-3f39b4611a15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098025 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098045 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098067 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-apiservice-cert\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098097 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-stats-auth\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098122 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e770c47d-95d6-45be-87cb-1fa3922afa82-metrics-tls\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: 
\"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098141 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-signing-key\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098165 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e770c47d-95d6-45be-87cb-1fa3922afa82-trusted-ca\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098188 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-webhook-cert\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098211 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jd7z\" (UniqueName: \"kubernetes.io/projected/31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8-kube-api-access-4jd7z\") pod \"package-server-manager-789f6589d5-nfmkn\" (UID: \"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098233 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098278 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtn4s\" (UniqueName: \"kubernetes.io/projected/a7917eb5-2c7a-426c-8850-4209cd22e790-kube-api-access-gtn4s\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098302 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098336 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-secret-volume\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 
17:35:51.098358 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098380 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpkrr\" (UniqueName: \"kubernetes.io/projected/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-kube-api-access-rpkrr\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098402 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098424 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a78540fe-014c-42e6-916c-3f39b4611a15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098446 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098467 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098491 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098512 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 
17:35:51.098532 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xfns\" (UniqueName: \"kubernetes.io/projected/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-kube-api-access-8xfns\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098553 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2vf\" (UniqueName: \"kubernetes.io/projected/1a46a1ee-5f40-4d85-b726-d758b7ceff37-kube-api-access-vr2vf\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098591 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-944vg\" (UniqueName: \"kubernetes.io/projected/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-kube-api-access-944vg\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098613 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098643 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-tmpfs\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098666 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/906d9a20-0731-435a-80af-0dab64476e32-service-ca-bundle\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098688 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f05b3314-839f-43ca-bb32-951ef0582151-cert\") pod \"ingress-canary-5m5vk\" (UID: \"f05b3314-839f-43ca-bb32-951ef0582151\") " pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098714 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xqp9\" (UniqueName: \"kubernetes.io/projected/ee298daa-0334-4d62-b83f-7c2499f55af6-kube-api-access-9xqp9\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098737 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-signing-cabundle\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098759 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmz4h\" (UniqueName: \"kubernetes.io/projected/6edcd95a-9780-4af1-9454-da6dce913528-kube-api-access-gmz4h\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098798 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee298daa-0334-4d62-b83f-7c2499f55af6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098817 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz2mj\" (UniqueName: \"kubernetes.io/projected/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-kube-api-access-vz2mj\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098834 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098851 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7917eb5-2c7a-426c-8850-4209cd22e790-serving-cert\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098868 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733046ae-dba4-407a-83ee-89677527d7cc-config\") pod \"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098894 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-config\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098914 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhc57\" (UniqueName: \"kubernetes.io/projected/733046ae-dba4-407a-83ee-89677527d7cc-kube-api-access-vhc57\") pod 
\"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098930 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098958 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sntsx\" (UniqueName: \"kubernetes.io/projected/e770c47d-95d6-45be-87cb-1fa3922afa82-kube-api-access-sntsx\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098973 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-dir\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098995 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.099804 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6edcd95a-9780-4af1-9454-da6dce913528-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.098643 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.100056 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-tmpfs\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.100161 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-dir\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 
17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.100471 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-signing-cabundle\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.101105 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a151c473-d304-4e1d-ba12-7860c0efbac9-srv-cert\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.101900 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-apiservice-cert\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.102049 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51ad4830-9e57-4bf2-91e5-7c24c7648d8b-metrics-tls\") pod \"dns-operator-744455d44c-nprpv\" (UID: \"51ad4830-9e57-4bf2-91e5-7c24c7648d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.102745 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-webhook-cert\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.102745 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a151c473-d304-4e1d-ba12-7860c0efbac9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.103531 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-signing-key\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.104481 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-secret-volume\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.108334 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.111744 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc26c755-5e1b-480b-b3ed-b3d3dee36d94-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkhg7\" (UID: \"dc26c755-5e1b-480b-b3ed-b3d3dee36d94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.128942 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.160179 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg24n\" (UniqueName: \"kubernetes.io/projected/bd4ee554-cd4d-4ff1-bef8-309484654b00-kube-api-access-sg24n\") pod \"openshift-config-operator-7777fb866f-fq57l\" (UID: \"bd4ee554-cd4d-4ff1-bef8-309484654b00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.167085 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.174507 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-proxy-tls\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.186919 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.228087 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.243012 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nfmkn\" (UID: \"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.255560 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.262789 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.268400 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.289073 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.293329 4690 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.309783 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.328002 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.354296 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.361069 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7917eb5-2c7a-426c-8850-4209cd22e790-trusted-ca\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.367376 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.387126 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.387642 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.394233 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7917eb5-2c7a-426c-8850-4209cd22e790-serving-cert\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.407806 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.427418 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.447972 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.459116 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-config-volume\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.468204 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.478706 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78540fe-014c-42e6-916c-3f39b4611a15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.488616 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.497146 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6edcd95a-9780-4af1-9454-da6dce913528-images\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.507512 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.528620 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.548702 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.550183 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7917eb5-2c7a-426c-8850-4209cd22e790-config\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.568709 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.588549 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.600996 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6edcd95a-9780-4af1-9454-da6dce913528-proxy-tls\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.607462 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.627346 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.647766 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.668379 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 17:35:51 crc 
kubenswrapper[4690]: I0320 17:35:51.675953 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733046ae-dba4-407a-83ee-89677527d7cc-serving-cert\") pod \"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.679111 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fq57l"] Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.688100 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.691402 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733046ae-dba4-407a-83ee-89677527d7cc-config\") pod \"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.707351 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.727394 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.747229 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.758570 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.769513 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.771240 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.788641 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.813008 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.825540 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.827101 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.832078 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.847639 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.867317 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.876186 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.904243 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.907143 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.915382 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.915799 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.930056 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.940855 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.945880 4690 
request.go:700] Waited for 1.015350007s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/secrets?fieldSelector=metadata.name%3Dv4-0-config-system-router-certs&limit=500&resourceVersion=0 Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.949244 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.956333 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.968597 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.975179 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.987674 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 17:35:51 crc kubenswrapper[4690]: I0320 17:35:51.996957 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-policies\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.006769 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.010158 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.022487 4690 generic.go:334] "Generic (PLEG): container finished" podID="bd4ee554-cd4d-4ff1-bef8-309484654b00" containerID="87af17b1065c6d35ac9e2cdfbdd2fe32069f4a01c079b1510d14e5f34065c50d" exitCode=0 Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.022558 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" event={"ID":"bd4ee554-cd4d-4ff1-bef8-309484654b00","Type":"ContainerDied","Data":"87af17b1065c6d35ac9e2cdfbdd2fe32069f4a01c079b1510d14e5f34065c50d"} Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.022589 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" 
event={"ID":"bd4ee554-cd4d-4ff1-bef8-309484654b00","Type":"ContainerStarted","Data":"7404c4debb7a8aa92e0522a3399abc7aeaede638e9844aef1f4f227063fe18ce"} Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.027367 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.031037 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.061693 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.067563 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.069299 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.088355 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.097620 4690 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.097800 4690 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.097653 4690 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.097829 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee298daa-0334-4d62-b83f-7c2499f55af6-config podName:ee298daa-0334-4d62-b83f-7c2499f55af6 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.597785012 +0000 UTC m=+227.463610700 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ee298daa-0334-4d62-b83f-7c2499f55af6-config") pod "kube-storage-version-migrator-operator-b67b599dd-fqfhl" (UID: "ee298daa-0334-4d62-b83f-7c2499f55af6") : failed to sync configmap cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.097712 4690 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.098009 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-serving-cert podName:e6328cb4-ec5c-4913-b7b9-ed18d759d7f1 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.597949787 +0000 UTC m=+227.463775505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-serving-cert") pod "kube-apiserver-operator-766d6c64bb-v7pjq" (UID: "e6328cb4-ec5c-4913-b7b9-ed18d759d7f1") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.098052 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-metrics-certs podName:906d9a20-0731-435a-80af-0dab64476e32 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.59803267 +0000 UTC m=+227.463858378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-metrics-certs") pod "router-default-5444994796-sv7wd" (UID: "906d9a20-0731-435a-80af-0dab64476e32") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.098097 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-default-certificate podName:906d9a20-0731-435a-80af-0dab64476e32 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.598082891 +0000 UTC m=+227.463908609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-default-certificate") pod "router-default-5444994796-sv7wd" (UID: "906d9a20-0731-435a-80af-0dab64476e32") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.099165 4690 secret.go:188] Couldn't get secret openshift-ingress-operator/metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.099305 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e770c47d-95d6-45be-87cb-1fa3922afa82-metrics-tls podName:e770c47d-95d6-45be-87cb-1fa3922afa82 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.599245935 +0000 UTC m=+227.465071813 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e770c47d-95d6-45be-87cb-1fa3922afa82-metrics-tls") pod "ingress-operator-5b745b69d9-ft28f" (UID: "e770c47d-95d6-45be-87cb-1fa3922afa82") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.099401 4690 configmap.go:193] Couldn't get configMap openshift-ingress-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.099538 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e770c47d-95d6-45be-87cb-1fa3922afa82-trusted-ca podName:e770c47d-95d6-45be-87cb-1fa3922afa82 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.599512993 +0000 UTC m=+227.465338701 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/e770c47d-95d6-45be-87cb-1fa3922afa82-trusted-ca") pod "ingress-operator-5b745b69d9-ft28f" (UID: "e770c47d-95d6-45be-87cb-1fa3922afa82") : failed to sync configmap cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.099194 4690 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.099885 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-stats-auth podName:906d9a20-0731-435a-80af-0dab64476e32 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.599867543 +0000 UTC m=+227.465693231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-stats-auth") pod "router-default-5444994796-sv7wd" (UID: "906d9a20-0731-435a-80af-0dab64476e32") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.100020 4690 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.100109 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f05b3314-839f-43ca-bb32-951ef0582151-cert podName:f05b3314-839f-43ca-bb32-951ef0582151 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.600074259 +0000 UTC m=+227.465899977 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f05b3314-839f-43ca-bb32-951ef0582151-cert") pod "ingress-canary-5m5vk" (UID: "f05b3314-839f-43ca-bb32-951ef0582151") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.100187 4690 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.100186 4690 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.100029 4690 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.100248 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-config podName:e6328cb4-ec5c-4913-b7b9-ed18d759d7f1 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.600232734 +0000 UTC m=+227.466058452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-config") pod "kube-apiserver-operator-766d6c64bb-v7pjq" (UID: "e6328cb4-ec5c-4913-b7b9-ed18d759d7f1") : failed to sync configmap cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.100850 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee298daa-0334-4d62-b83f-7c2499f55af6-serving-cert podName:ee298daa-0334-4d62-b83f-7c2499f55af6 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.600818691 +0000 UTC m=+227.466644409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee298daa-0334-4d62-b83f-7c2499f55af6-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-fqfhl" (UID: "ee298daa-0334-4d62-b83f-7c2499f55af6") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: E0320 17:35:52.100899 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/906d9a20-0731-435a-80af-0dab64476e32-service-ca-bundle podName:906d9a20-0731-435a-80af-0dab64476e32 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:52.600884163 +0000 UTC m=+227.466709871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/906d9a20-0731-435a-80af-0dab64476e32-service-ca-bundle") pod "router-default-5444994796-sv7wd" (UID: "906d9a20-0731-435a-80af-0dab64476e32") : failed to sync configmap cache: timed out waiting for the condition Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.110476 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.127806 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.149400 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.168451 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.188209 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.208179 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.228311 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.247603 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.268753 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.308927 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.309097 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.328787 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.348170 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.368974 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.388401 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.408333 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.427716 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 17:35:52 crc 
kubenswrapper[4690]: I0320 17:35:52.448704 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.467736 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.487459 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.508229 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.529056 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.547785 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.567845 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.589314 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.607776 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.628804 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.631653 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/906d9a20-0731-435a-80af-0dab64476e32-service-ca-bundle\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.631907 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f05b3314-839f-43ca-bb32-951ef0582151-cert\") pod \"ingress-canary-5m5vk\" (UID: \"f05b3314-839f-43ca-bb32-951ef0582151\") " pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.632147 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee298daa-0334-4d62-b83f-7c2499f55af6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.632373 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-config\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.632749 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee298daa-0334-4d62-b83f-7c2499f55af6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.633052 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-metrics-certs\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.633308 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-default-certificate\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.633547 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/906d9a20-0731-435a-80af-0dab64476e32-service-ca-bundle\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.633558 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.633923 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee298daa-0334-4d62-b83f-7c2499f55af6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.633939 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-stats-auth\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.634056 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e770c47d-95d6-45be-87cb-1fa3922afa82-metrics-tls\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.634218 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e770c47d-95d6-45be-87cb-1fa3922afa82-trusted-ca\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.634229 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-config\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.635471 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e770c47d-95d6-45be-87cb-1fa3922afa82-trusted-ca\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.637742 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee298daa-0334-4d62-b83f-7c2499f55af6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.638342 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-metrics-certs\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.638993 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-stats-auth\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.639302 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/906d9a20-0731-435a-80af-0dab64476e32-default-certificate\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.639919 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e770c47d-95d6-45be-87cb-1fa3922afa82-metrics-tls\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.641637 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.648765 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.657554 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f05b3314-839f-43ca-bb32-951ef0582151-cert\") pod \"ingress-canary-5m5vk\" (UID: \"f05b3314-839f-43ca-bb32-951ef0582151\") " pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.668082 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.707660 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.730216 4690 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.749610 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.793650 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7qc9\" (UniqueName: \"kubernetes.io/projected/40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8-kube-api-access-d7qc9\") pod \"catalog-operator-68c6474976-f646m\" (UID: \"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.816716 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20a97349-3805-4434-be4a-1cb8024add50-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-n5c8r\" (UID: \"20a97349-3805-4434-be4a-1cb8024add50\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.828162 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrhb\" (UniqueName: \"kubernetes.io/projected/8baee130-f518-4071-afbc-13625917aa7b-kube-api-access-srrhb\") pod \"openshift-controller-manager-operator-756b6f6bc6-mk5m6\" (UID: \"8baee130-f518-4071-afbc-13625917aa7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.834192 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.849047 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zkhd\" (UniqueName: \"kubernetes.io/projected/28a597c2-65fe-4f1f-b4da-8cedf2c92a6b-kube-api-access-5zkhd\") pod \"machine-approver-56656f9798-tzrf8\" (UID: \"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.859129 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.866414 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4b5\" (UniqueName: \"kubernetes.io/projected/9a646659-b6c9-42c0-9bc8-ae149ad8ba85-kube-api-access-zk4b5\") pod \"authentication-operator-69f744f599-dxqqz\" (UID: \"9a646659-b6c9-42c0-9bc8-ae149ad8ba85\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.892939 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4c77\" (UniqueName: \"kubernetes.io/projected/c4eaf3f2-8536-46bf-8c5f-82606abec128-kube-api-access-w4c77\") pod \"console-f9d7485db-ppgjz\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.893703 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.907117 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.932440 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.965959 4690 request.go:700] Waited for 1.98416934s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.966723 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7tvv\" (UniqueName: \"kubernetes.io/projected/ff4fe98d-c7c0-475a-85cb-70ab2c4ad122-kube-api-access-b7tvv\") pod \"multus-admission-controller-857f4d67dd-6rw4d\" (UID: \"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" Mar 20 17:35:52 crc kubenswrapper[4690]: I0320 17:35:52.993393 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtrf9\" (UniqueName: \"kubernetes.io/projected/a0c61344-19c2-4d8b-8aec-be86ac403866-kube-api-access-qtrf9\") pod \"apiserver-76f77b778f-8l2n9\" (UID: \"a0c61344-19c2-4d8b-8aec-be86ac403866\") " pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.013027 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2657z\" (UniqueName: \"kubernetes.io/projected/fcf13749-fd7c-4f01-9598-7f041910cd74-kube-api-access-2657z\") pod \"controller-manager-879f6c89f-wbkxs\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.027280 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" event={"ID":"bd4ee554-cd4d-4ff1-bef8-309484654b00","Type":"ContainerStarted","Data":"08cf48cf7686b0db5f4484a24027fc2cd8a39f2db10a57c26d126d908be0e45a"} Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.027494 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 
20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.034917 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg58v\" (UniqueName: \"kubernetes.io/projected/c73bcf80-34dc-466e-b1b0-a92850850498-kube-api-access-cg58v\") pod \"apiserver-7bbb656c7d-52php\" (UID: \"c73bcf80-34dc-466e-b1b0-a92850850498\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.036993 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.045761 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bkg9\" (UniqueName: \"kubernetes.io/projected/41847043-0aca-46d5-940f-3dfd2ded491f-kube-api-access-4bkg9\") pod \"etcd-operator-b45778765-nm2vw\" (UID: \"41847043-0aca-46d5-940f-3dfd2ded491f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.056354 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" Mar 20 17:35:53 crc kubenswrapper[4690]: W0320 17:35:53.057515 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a597c2_65fe_4f1f_b4da_8cedf2c92a6b.slice/crio-1bc192289e63995005dc4b81466b1cebe6e490a63942182140c2a7cb36cbb1e0 WatchSource:0}: Error finding container 1bc192289e63995005dc4b81466b1cebe6e490a63942182140c2a7cb36cbb1e0: Status 404 returned error can't find the container with id 1bc192289e63995005dc4b81466b1cebe6e490a63942182140c2a7cb36cbb1e0 Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.064219 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.066709 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6"] Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.068095 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6496f\" (UniqueName: \"kubernetes.io/projected/ace7a9fa-7eac-449c-8b61-6018d592fc4f-kube-api-access-6496f\") pod \"route-controller-manager-6576b87f9c-zklpl\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:53 crc kubenswrapper[4690]: W0320 17:35:53.083880 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8baee130_f518_4071_afbc_13625917aa7b.slice/crio-86de7b03a7dcd87d1d5460d121a01d832f9549c83e320c13f89a6a5cfccf8aa8 WatchSource:0}: Error finding container 86de7b03a7dcd87d1d5460d121a01d832f9549c83e320c13f89a6a5cfccf8aa8: Status 404 returned error can't find the container with id 86de7b03a7dcd87d1d5460d121a01d832f9549c83e320c13f89a6a5cfccf8aa8 Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.087232 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndgf\" (UniqueName: \"kubernetes.io/projected/789eef8f-04a8-44cf-9e16-878de3a035bb-kube-api-access-8ndgf\") pod \"downloads-7954f5f757-v9wf6\" (UID: \"789eef8f-04a8-44cf-9e16-878de3a035bb\") " pod="openshift-console/downloads-7954f5f757-v9wf6" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.105355 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7nx\" (UniqueName: \"kubernetes.io/projected/bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d-kube-api-access-bf7nx\") pod \"openshift-apiserver-operator-796bbdcf4f-j6k6w\" (UID: \"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.125575 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.126480 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7g2h\" (UniqueName: \"kubernetes.io/projected/bb444275-6cc1-42be-b742-afc344a60995-kube-api-access-n7g2h\") pod \"machine-api-operator-5694c8668f-9n98c\" (UID: \"bb444275-6cc1-42be-b742-afc344a60995\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.127640 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.128159 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m"] Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.129615 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.143458 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-v9wf6" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.147325 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.153088 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.169151 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.171816 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.212038 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6b5\" (UniqueName: \"kubernetes.io/projected/dc26c755-5e1b-480b-b3ed-b3d3dee36d94-kube-api-access-9v6b5\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkhg7\" (UID: \"dc26c755-5e1b-480b-b3ed-b3d3dee36d94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.219319 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.221221 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57922\" (UniqueName: \"kubernetes.io/projected/80d86fac-74cc-41d4-81df-2e718c1568d9-kube-api-access-57922\") pod \"marketplace-operator-79b997595-dnpcn\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.223693 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.245043 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mszs7\" (UniqueName: \"kubernetes.io/projected/a151c473-d304-4e1d-ba12-7860c0efbac9-kube-api-access-mszs7\") pod \"olm-operator-6b444d44fb-tv6bv\" (UID: \"a151c473-d304-4e1d-ba12-7860c0efbac9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.250311 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.252908 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.264397 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.267485 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s6xz\" (UID: \"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.286273 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.294393 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r"] Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.296795 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e770c47d-95d6-45be-87cb-1fa3922afa82-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.304967 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.319185 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xcff\" (UniqueName: \"kubernetes.io/projected/f05b3314-839f-43ca-bb32-951ef0582151-kube-api-access-8xcff\") pod \"ingress-canary-5m5vk\" (UID: \"f05b3314-839f-43ca-bb32-951ef0582151\") " pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.334339 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2wfr\" (UniqueName: \"kubernetes.io/projected/906d9a20-0731-435a-80af-0dab64476e32-kube-api-access-h2wfr\") pod \"router-default-5444994796-sv7wd\" (UID: \"906d9a20-0731-435a-80af-0dab64476e32\") " pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.343127 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gbmq\" (UniqueName: \"kubernetes.io/projected/a78540fe-014c-42e6-916c-3f39b4611a15-kube-api-access-7gbmq\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.348751 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.369727 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mng4t\" (UniqueName: \"kubernetes.io/projected/51ad4830-9e57-4bf2-91e5-7c24c7648d8b-kube-api-access-mng4t\") pod \"dns-operator-744455d44c-nprpv\" (UID: \"51ad4830-9e57-4bf2-91e5-7c24c7648d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.391727 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5m5vk" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.395369 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9b7t\" (UniqueName: \"kubernetes.io/projected/fc1d890d-f494-466b-94a2-03c2d2c3fe7f-kube-api-access-k9b7t\") pod \"migrator-59844c95c7-8pvtf\" (UID: \"fc1d890d-f494-466b-94a2-03c2d2c3fe7f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.404742 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpkrr\" (UniqueName: \"kubernetes.io/projected/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-kube-api-access-rpkrr\") pod \"collect-profiles-29567130-scc4x\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.419668 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nm2vw"] Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.441745 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtn4s\" (UniqueName: \"kubernetes.io/projected/a7917eb5-2c7a-426c-8850-4209cd22e790-kube-api-access-gtn4s\") pod \"console-operator-58897d9998-6zcl9\" (UID: \"a7917eb5-2c7a-426c-8850-4209cd22e790\") " pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.446707 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jd7z\" (UniqueName: \"kubernetes.io/projected/31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8-kube-api-access-4jd7z\") pod \"package-server-manager-789f6589d5-nfmkn\" (UID: \"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.468070 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9flwk\" (UniqueName: \"kubernetes.io/projected/34d2f5b9-1f8e-4413-b178-58cd10fa7548-kube-api-access-9flwk\") pod \"auto-csr-approver-29567134-66l98\" (UID: \"34d2f5b9-1f8e-4413-b178-58cd10fa7548\") " pod="openshift-infra/auto-csr-approver-29567134-66l98" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.481820 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.481873 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xqp9\" (UniqueName: \"kubernetes.io/projected/ee298daa-0334-4d62-b83f-7c2499f55af6-kube-api-access-9xqp9\") pod \"kube-storage-version-migrator-operator-b67b599dd-fqfhl\" (UID: \"ee298daa-0334-4d62-b83f-7c2499f55af6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.504119 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2vf\" (UniqueName: \"kubernetes.io/projected/1a46a1ee-5f40-4d85-b726-d758b7ceff37-kube-api-access-vr2vf\") pod \"oauth-openshift-558db77b4-2b2sh\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.516607 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.539675 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.560432 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.564960 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-944vg\" (UniqueName: \"kubernetes.io/projected/429d8b5d-8e50-4115-89e7-1c8d3f53bd27-kube-api-access-944vg\") pod \"machine-config-controller-84d6567774-jtggf\" (UID: \"429d8b5d-8e50-4115-89e7-1c8d3f53bd27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.565574 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xfns\" (UniqueName: \"kubernetes.io/projected/d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c-kube-api-access-8xfns\") pod \"service-ca-9c57cc56f-d7smm\" (UID: \"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c\") " pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.571805 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.573926 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6328cb4-ec5c-4913-b7b9-ed18d759d7f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v7pjq\" (UID: \"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.590364 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmz4h\" (UniqueName: \"kubernetes.io/projected/6edcd95a-9780-4af1-9454-da6dce913528-kube-api-access-gmz4h\") pod \"machine-config-operator-74547568cd-sfhzf\" (UID: \"6edcd95a-9780-4af1-9454-da6dce913528\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.603507 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sntsx\" (UniqueName: \"kubernetes.io/projected/e770c47d-95d6-45be-87cb-1fa3922afa82-kube-api-access-sntsx\") pod \"ingress-operator-5b745b69d9-ft28f\" (UID: \"e770c47d-95d6-45be-87cb-1fa3922afa82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.618252 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.621327 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.626741 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhc57\" (UniqueName: \"kubernetes.io/projected/733046ae-dba4-407a-83ee-89677527d7cc-kube-api-access-vhc57\") pod \"service-ca-operator-777779d784-275fn\" (UID: \"733046ae-dba4-407a-83ee-89677527d7cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.637814 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.647677 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz2mj\" (UniqueName: \"kubernetes.io/projected/b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf-kube-api-access-vz2mj\") pod \"packageserver-d55dfcdfc-2n45v\" (UID: \"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.654894 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.664610 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.676514 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-66l98" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.680696 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a78540fe-014c-42e6-916c-3f39b4611a15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cg94j\" (UID: \"a78540fe-014c-42e6-916c-3f39b4611a15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.686181 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.759916 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.759966 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.759985 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz9rp\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-kube-api-access-cz9rp\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.760007 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-certificates\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.763881 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-bound-sa-token\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.763969 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.763988 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-tls\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.764003 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-trusted-ca\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.764081 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2rl8\" (UniqueName: \"kubernetes.io/projected/73b9ca3f-754a-4970-85ef-b3203caee6e4-kube-api-access-b2rl8\") pod \"cluster-samples-operator-665b6dd947-4kp5n\" (UID: \"73b9ca3f-754a-4970-85ef-b3203caee6e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.764129 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/73b9ca3f-754a-4970-85ef-b3203caee6e4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kp5n\" (UID: \"73b9ca3f-754a-4970-85ef-b3203caee6e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" Mar 20 17:35:53 crc kubenswrapper[4690]: E0320 17:35:53.769592 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:54.269572048 +0000 UTC m=+229.135397726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.778233 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.789030 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.821730 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v9wf6"] Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.833379 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dxqqz"] Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.833820 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" Mar 20 17:35:53 crc kubenswrapper[4690]: W0320 17:35:53.856067 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod906d9a20_0731_435a_80af_0dab64476e32.slice/crio-ece7fd930e1e05864c1702c303f13819b8f0e44dba46cb6b3523c100013f5538 WatchSource:0}: Error finding container ece7fd930e1e05864c1702c303f13819b8f0e44dba46cb6b3523c100013f5538: Status 404 returned error can't find the container with id ece7fd930e1e05864c1702c303f13819b8f0e44dba46cb6b3523c100013f5538 Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.865344 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.865517 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/73b9ca3f-754a-4970-85ef-b3203caee6e4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kp5n\" (UID: \"73b9ca3f-754a-4970-85ef-b3203caee6e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.865683 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5fnf\" (UniqueName: \"kubernetes.io/projected/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-kube-api-access-q5fnf\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.865709 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-metrics-tls\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.865792 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-csi-data-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.869454 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.881711 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.882193 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7"] Mar 20 17:35:53 crc kubenswrapper[4690]: E0320 17:35:53.882328 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:54.38230043 +0000 UTC m=+229.248126098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.882652 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gvtc\" (UniqueName: \"kubernetes.io/projected/861dbb3a-f563-415b-ae55-45dfd9f7208b-kube-api-access-2gvtc\") pod \"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.882786 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.882820 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz9rp\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-kube-api-access-cz9rp\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.883105 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-certificates\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.883160 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-registration-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.883182 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-plugins-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: E0320 17:35:53.884522 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:54.384506974 +0000 UTC m=+229.250332652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.884950 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.886458 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-certificates\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.888281 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-config-volume\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.888966 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-bound-sa-token\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.889997 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-socket-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.890032 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p27pm\" (UniqueName: \"kubernetes.io/projected/43391457-a499-43df-82a4-15be4ce2a0ac-kube-api-access-p27pm\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.890067 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/861dbb3a-f563-415b-ae55-45dfd9f7208b-certs\") pod 
\"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.890469 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-mountpoint-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.890731 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.890773 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-tls\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.890797 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-trusted-ca\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.891026 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.892542 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.892718 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.893484 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2rl8\" (UniqueName: \"kubernetes.io/projected/73b9ca3f-754a-4970-85ef-b3203caee6e4-kube-api-access-b2rl8\") pod \"cluster-samples-operator-665b6dd947-4kp5n\" (UID: \"73b9ca3f-754a-4970-85ef-b3203caee6e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.893984 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-trusted-ca\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.894691 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/861dbb3a-f563-415b-ae55-45dfd9f7208b-node-bootstrap-token\") pod \"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.894886 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/73b9ca3f-754a-4970-85ef-b3203caee6e4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kp5n\" (UID: \"73b9ca3f-754a-4970-85ef-b3203caee6e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.909999 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-tls\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.933305 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz9rp\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-kube-api-access-cz9rp\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.985012 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2rl8\" (UniqueName: \"kubernetes.io/projected/73b9ca3f-754a-4970-85ef-b3203caee6e4-kube-api-access-b2rl8\") pod \"cluster-samples-operator-665b6dd947-4kp5n\" (UID: \"73b9ca3f-754a-4970-85ef-b3203caee6e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.985074 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-bound-sa-token\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.995869 
4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996005 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-registration-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996025 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-plugins-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996069 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-config-volume\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996092 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-socket-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996112 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/861dbb3a-f563-415b-ae55-45dfd9f7208b-certs\") pod \"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996138 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p27pm\" (UniqueName: \"kubernetes.io/projected/43391457-a499-43df-82a4-15be4ce2a0ac-kube-api-access-p27pm\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996163 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-mountpoint-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996226 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/861dbb3a-f563-415b-ae55-45dfd9f7208b-node-bootstrap-token\") pod \"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996291 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5fnf\" (UniqueName: \"kubernetes.io/projected/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-kube-api-access-q5fnf\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996328 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-metrics-tls\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996347 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-csi-data-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996383 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gvtc\" (UniqueName: \"kubernetes.io/projected/861dbb3a-f563-415b-ae55-45dfd9f7208b-kube-api-access-2gvtc\") pod \"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:53 crc kubenswrapper[4690]: E0320 17:35:53.996565 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:54.496548866 +0000 UTC m=+229.362374544 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996805 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-registration-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.996845 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-plugins-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.997384 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-config-volume\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:53 crc kubenswrapper[4690]: I0320 17:35:53.997443 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-socket-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.000703 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-mountpoint-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.006168 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/43391457-a499-43df-82a4-15be4ce2a0ac-csi-data-dir\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.017596 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/861dbb3a-f563-415b-ae55-45dfd9f7208b-certs\") pod \"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.018939 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/861dbb3a-f563-415b-ae55-45dfd9f7208b-node-bootstrap-token\") pod \"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:54 crc kubenswrapper[4690]: 
I0320 17:35:54.019001 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-metrics-tls\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.039077 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4d"] Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.039112 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ppgjz"] Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.039125 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8l2n9"] Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.041118 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" event={"ID":"41847043-0aca-46d5-940f-3dfd2ded491f","Type":"ContainerStarted","Data":"217c8f2761c8c2672ccdb5f9f632cae4edf0c7ec3517e6e42ce95eef9ac249fb"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.042127 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" event={"ID":"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8","Type":"ContainerStarted","Data":"b69d3a3f2ef076333452c3b65117460f075314b196f8965e3b6c2df45c240e32"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.042150 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" event={"ID":"40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8","Type":"ContainerStarted","Data":"972d80195fca70466db657a39dc20b8845a7c9f641b65cdc5976983a2e5e4d59"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.042704 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.044197 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" event={"ID":"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b","Type":"ContainerStarted","Data":"c61e6b648afcb689919febdc6d9a9ea430b606e79cd1346fc322ab4ce39c1050"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.044227 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" event={"ID":"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b","Type":"ContainerStarted","Data":"1bc192289e63995005dc4b81466b1cebe6e490a63942182140c2a7cb36cbb1e0"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.044753 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-sv7wd" event={"ID":"906d9a20-0731-435a-80af-0dab64476e32","Type":"ContainerStarted","Data":"ece7fd930e1e05864c1702c303f13819b8f0e44dba46cb6b3523c100013f5538"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.045448 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" event={"ID":"20a97349-3805-4434-be4a-1cb8024add50","Type":"ContainerStarted","Data":"c366d51b2ec91240e0ab86c1fff74bc9e28efa1e25d2fd743685dd6063e640a5"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.045470 4690 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" event={"ID":"20a97349-3805-4434-be4a-1cb8024add50","Type":"ContainerStarted","Data":"07dfe3f4d316b8839c7ec293a867c8170fc82a71adc8200a3e776da86cfb5398"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.046710 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" event={"ID":"8baee130-f518-4071-afbc-13625917aa7b","Type":"ContainerStarted","Data":"b80fbacb5b5c5b8a65e754d9f35dd8798ff051985acda85f54e9628bdf02b770"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.046770 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" event={"ID":"8baee130-f518-4071-afbc-13625917aa7b","Type":"ContainerStarted","Data":"86de7b03a7dcd87d1d5460d121a01d832f9549c83e320c13f89a6a5cfccf8aa8"} Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.058331 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gvtc\" (UniqueName: \"kubernetes.io/projected/861dbb3a-f563-415b-ae55-45dfd9f7208b-kube-api-access-2gvtc\") pod \"machine-config-server-fqlzx\" (UID: \"861dbb3a-f563-415b-ae55-45dfd9f7208b\") " pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.085102 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p27pm\" (UniqueName: \"kubernetes.io/projected/43391457-a499-43df-82a4-15be4ce2a0ac-kube-api-access-p27pm\") pod \"csi-hostpathplugin-88brt\" (UID: \"43391457-a499-43df-82a4-15be4ce2a0ac\") " pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.086925 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5fnf\" (UniqueName: \"kubernetes.io/projected/daded41b-1e26-4dde-aded-4a2e3c1dc4fd-kube-api-access-q5fnf\") pod \"dns-default-6rktv\" (UID: \"daded41b-1e26-4dde-aded-4a2e3c1dc4fd\") " pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.097954 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.098329 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:54.598313138 +0000 UTC m=+229.464138816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.110679 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.121534 4690 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-f646m container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.121589 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" podUID="40dc1d7f-44d0-4ded-92b5-c2cf3df0bfd8" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 20 17:35:54 crc kubenswrapper[4690]: W0320 17:35:54.131210 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a646659_b6c9_42c0_9bc8_ae149ad8ba85.slice/crio-887e0138e31c842a5e0af5e7debce5632548ac8625dce926db6c9601ee185b07 WatchSource:0}: Error finding container 887e0138e31c842a5e0af5e7debce5632548ac8625dce926db6c9601ee185b07: Status 404 returned error can't find the container with id 887e0138e31c842a5e0af5e7debce5632548ac8625dce926db6c9601ee185b07 Mar 20 17:35:54 crc kubenswrapper[4690]: W0320 17:35:54.155537 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4eaf3f2_8536_46bf_8c5f_82606abec128.slice/crio-1486a39e93d8e0297220cafe8abb99f98cb2467ab826f72f2f6792f169089f3e WatchSource:0}: Error finding container 1486a39e93d8e0297220cafe8abb99f98cb2467ab826f72f2f6792f169089f3e: Status 404 returned error can't find the container with id 1486a39e93d8e0297220cafe8abb99f98cb2467ab826f72f2f6792f169089f3e Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.200775 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.203336 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:54.703315565 +0000 UTC m=+229.569141243 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.278968 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.279028 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.297852 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-88brt" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.303101 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.303392 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:54.803380068 +0000 UTC m=+229.669205736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.327457 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.327477 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fqlzx" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.368313 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-52php"] Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.400034 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.405478 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.405846 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:54.90582963 +0000 UTC m=+229.771655308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.429356 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbkxs"] Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.506816 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.507891 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.007877721 +0000 UTC m=+229.873703399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.607682 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.608100 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.108085398 +0000 UTC m=+229.973911076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.708756 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.709540 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.209529781 +0000 UTC m=+230.075355459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.813068 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.813139 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.313122857 +0000 UTC m=+230.178948535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.813296 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.813552 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.313543389 +0000 UTC m=+230.179369067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.915095 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dnpcn"] Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.916251 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.916474 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.416335222 +0000 UTC m=+230.282160890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.916862 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:54 crc kubenswrapper[4690]: E0320 17:35:54.917212 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.417199797 +0000 UTC m=+230.283025475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.931058 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w"] Mar 20 17:35:54 crc kubenswrapper[4690]: I0320 17:35:54.942901 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5m5vk"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.018970 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.019457 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.519442423 +0000 UTC m=+230.385268101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.058194 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" podStartSLOduration=187.058176651 podStartE2EDuration="3m7.058176651s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:55.00248868 +0000 UTC m=+229.868314358" watchObservedRunningTime="2026-03-20 17:35:55.058176651 +0000 UTC m=+229.924002329" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.066384 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.067771 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" event={"ID":"28a597c2-65fe-4f1f-b4da-8cedf2c92a6b","Type":"ContainerStarted","Data":"ed441888447f7e16942e97d3fa485e5bcc79da62522696fdcd2b2985df42304b"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.079294 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zcl9"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.079574 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" event={"ID":"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122","Type":"ContainerStarted","Data":"8560d9fea925dfaa9126f0655816d416216e3f369d10f0e5475f18a816e05beb"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.079624 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" event={"ID":"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122","Type":"ContainerStarted","Data":"f772f57c2866cdb312a12dd7c6cfbfaccbbafc7af81a01426ce8a5e6095c0449"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.082343 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.083133 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ppgjz" event={"ID":"c4eaf3f2-8536-46bf-8c5f-82606abec128","Type":"ContainerStarted","Data":"1486a39e93d8e0297220cafe8abb99f98cb2467ab826f72f2f6792f169089f3e"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.083840 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" event={"ID":"fcf13749-fd7c-4f01-9598-7f041910cd74","Type":"ContainerStarted","Data":"8cb321c7eaacad68afc4f8a1a40ab6006f1bf22164f2dee3c73928959510feb7"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.093570 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" event={"ID":"41847043-0aca-46d5-940f-3dfd2ded491f","Type":"ContainerStarted","Data":"9bf1bacec8c023da97fdf93f9d1659a121117696c9bcfb088f32d42e5765ac36"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.107339 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9n98c"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.115845 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" event={"ID":"80d86fac-74cc-41d4-81df-2e718c1568d9","Type":"ContainerStarted","Data":"cf6df4e6f7fe75b0a336e3b90e47481446533d1991a1e2916fbe7c0f4b5977b2"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.121417 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nprpv"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.121510 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.121787 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.621775462 +0000 UTC m=+230.487601140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.125005 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-sv7wd" event={"ID":"906d9a20-0731-435a-80af-0dab64476e32","Type":"ContainerStarted","Data":"055a68d23d20849df7047736a476fef9d48bae8aa6a61d2a54e960186a14ff8f"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.153135 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.158846 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.162810 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2b2sh"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.180359 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" event={"ID":"dc26c755-5e1b-480b-b3ed-b3d3dee36d94","Type":"ContainerStarted","Data":"27325809a19c895fd898ff6746635d5cf723d4751b0e3835b729338d065ec485"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.184561 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.190361 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" event={"ID":"a0c61344-19c2-4d8b-8aec-be86ac403866","Type":"ContainerStarted","Data":"65449d626cfc5f6231745ba66ed8a0613464410a4d56de668fc23c3952151018"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.195465 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5m5vk" event={"ID":"f05b3314-839f-43ca-bb32-951ef0582151","Type":"ContainerStarted","Data":"5dda2b9e6f03bc9aeca54f1938144c1f1e3f81cd122ce1873852caf708c8628e"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.207319 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.213515 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" event={"ID":"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d","Type":"ContainerStarted","Data":"2d618a081eead42baaa05bae63c06d009ecb87cf3cb417baa7f96d462426bacb"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.215905 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" event={"ID":"9a646659-b6c9-42c0-9bc8-ae149ad8ba85","Type":"ContainerStarted","Data":"887e0138e31c842a5e0af5e7debce5632548ac8625dce926db6c9601ee185b07"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.218226 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-v9wf6" event={"ID":"789eef8f-04a8-44cf-9e16-878de3a035bb","Type":"ContainerStarted","Data":"b910be92537376a336602f1519fe1b6304affb69ad13b6f5d5140b74fe9ee3e9"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.218248 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v9wf6" event={"ID":"789eef8f-04a8-44cf-9e16-878de3a035bb","Type":"ContainerStarted","Data":"6ca6f8257e0fcc8569d3d8a64f3e5340c31b487d64c6100f1a4b898b43cd165a"} Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.221672 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-v9wf6" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.224872 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.224974 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.231821 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.731784825 +0000 UTC m=+230.597610503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.235921 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-66l98"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.236049 4690 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9wf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.236104 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9wf6" podUID="789eef8f-04a8-44cf-9e16-878de3a035bb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.238115 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d7smm"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.243087 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" event={"ID":"c73bcf80-34dc-466e-b1b0-a92850850498","Type":"ContainerStarted","Data":"8ed687ecb54c3e7435dcd9776cd51ab9f42f36239205252bd53264c20f615a20"} Mar 20 17:35:55 crc 
kubenswrapper[4690]: I0320 17:35:55.265948 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" Mar 20 17:35:55 crc kubenswrapper[4690]: W0320 17:35:55.267952 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb444275_6cc1_42be_b742_afc344a60995.slice/crio-082b2deee4d47e2c5836da70e2823f17de3b014adc4748399588fcb8dd3c3e4c WatchSource:0}: Error finding container 082b2deee4d47e2c5836da70e2823f17de3b014adc4748399588fcb8dd3c3e4c: Status 404 returned error can't find the container with id 082b2deee4d47e2c5836da70e2823f17de3b014adc4748399588fcb8dd3c3e4c Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.272075 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq"] Mar 20 17:35:55 crc kubenswrapper[4690]: W0320 17:35:55.278277 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51ad4830_9e57_4bf2_91e5_7c24c7648d8b.slice/crio-04cc1b4f2e6849d4499feb049d0db8a49c45df812913a400db0cc9b0f338e33b WatchSource:0}: Error finding container 04cc1b4f2e6849d4499feb049d0db8a49c45df812913a400db0cc9b0f338e33b: Status 404 returned error can't find the container with id 04cc1b4f2e6849d4499feb049d0db8a49c45df812913a400db0cc9b0f338e33b Mar 20 17:35:55 crc kubenswrapper[4690]: W0320 17:35:55.288082 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda151c473_d304_4e1d_ba12_7860c0efbac9.slice/crio-bee05fde2edae94cd34435c6a21b00bf558ab5eeb61f437389285e80a6dd18c3 WatchSource:0}: Error finding container bee05fde2edae94cd34435c6a21b00bf558ab5eeb61f437389285e80a6dd18c3: Status 404 returned error can't find the container with id bee05fde2edae94cd34435c6a21b00bf558ab5eeb61f437389285e80a6dd18c3 Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.302571 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.317198 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.329988 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.336319 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.362338 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.365034 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:35:55.865016293 +0000 UTC m=+230.730841971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.367397 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-88brt"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.392116 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.410128 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-275fn"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.448331 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.464459 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f646m" podStartSLOduration=187.464425187 podStartE2EDuration="3m7.464425187s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:55.452946423 +0000 UTC m=+230.318772111" watchObservedRunningTime="2026-03-20 17:35:55.464425187 +0000 UTC m=+230.330250865" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.464649 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.472598 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.472875 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.972856132 +0000 UTC m=+230.838681810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.473018 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.473822 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:55.97379113 +0000 UTC m=+230.839616808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.481325 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6rktv"] Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.512584 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mk5m6" podStartSLOduration=187.512548838 podStartE2EDuration="3m7.512548838s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:55.500958331 +0000 UTC m=+230.366784009" watchObservedRunningTime="2026-03-20 17:35:55.512548838 +0000 UTC m=+230.378374506" Mar 20 17:35:55 crc kubenswrapper[4690]: W0320 17:35:55.517330 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6edcd95a_9780_4af1_9454_da6dce913528.slice/crio-70cb1d8b2759bc464b2f1b5963f6e8dd20669f1959072f34f2a3e85ea7ecb705 WatchSource:0}: Error finding container 70cb1d8b2759bc464b2f1b5963f6e8dd20669f1959072f34f2a3e85ea7ecb705: Status 404 returned error can't find the container with id 70cb1d8b2759bc464b2f1b5963f6e8dd20669f1959072f34f2a3e85ea7ecb705 Mar 20 17:35:55 crc kubenswrapper[4690]: W0320 17:35:55.555642 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaded41b_1e26_4dde_aded_4a2e3c1dc4fd.slice/crio-03c3480c520d6902b80fa758debfdc6a0bae8ed1e73d9bd744e98d4c5f810981 WatchSource:0}: Error finding container 03c3480c520d6902b80fa758debfdc6a0bae8ed1e73d9bd744e98d4c5f810981: Status 404 returned error can't find the container with id 
03c3480c520d6902b80fa758debfdc6a0bae8ed1e73d9bd744e98d4c5f810981 Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.577179 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.577322 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.077298923 +0000 UTC m=+230.943124591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.578833 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.580649 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.08062515 +0000 UTC m=+230.946450828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.623994 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.624054 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.624158 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.641947 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nm2vw" podStartSLOduration=187.641921084 podStartE2EDuration="3m7.641921084s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:55.599642923 +0000 UTC m=+230.465468601" watchObservedRunningTime="2026-03-20 17:35:55.641921084 +0000 UTC m=+230.507746762" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.680144 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.680391 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.180364433 +0000 UTC m=+231.046190111 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.683219 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tzrf8" podStartSLOduration=187.683204196 podStartE2EDuration="3m7.683204196s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:55.643671335 +0000 UTC m=+230.509497023" watchObservedRunningTime="2026-03-20 17:35:55.683204196 +0000 UTC m=+230.549029874" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.688598 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.689093 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.189077937 +0000 UTC m=+231.054903615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.722623 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-v9wf6" podStartSLOduration=187.722603613 podStartE2EDuration="3m7.722603613s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:55.719551644 +0000 UTC m=+230.585377322" watchObservedRunningTime="2026-03-20 17:35:55.722603613 +0000 UTC m=+230.588429291" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.772321 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n5c8r" podStartSLOduration=187.772301959 podStartE2EDuration="3m7.772301959s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:55.763820123 +0000 UTC m=+230.629645801" watchObservedRunningTime="2026-03-20 17:35:55.772301959 +0000 UTC m=+230.638127637" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.790955 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.791470 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.291444457 +0000 UTC m=+231.157270195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.842903 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-sv7wd" podStartSLOduration=187.842875904 podStartE2EDuration="3m7.842875904s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:55.84136201 +0000 UTC m=+230.707187708" watchObservedRunningTime="2026-03-20 17:35:55.842875904 +0000 UTC m=+230.708701582" Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.894973 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.895970 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.395952009 +0000 UTC m=+231.261777687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:55 crc kubenswrapper[4690]: I0320 17:35:55.996482 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:55 crc kubenswrapper[4690]: E0320 17:35:55.996829 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.496811445 +0000 UTC m=+231.362637123 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.072002 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbkxs"] Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.100376 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.100927 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.600903455 +0000 UTC m=+231.466729143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.178697 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl"] Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.206631 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.207003 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.706976523 +0000 UTC m=+231.572802231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.274646 4690 generic.go:334] "Generic (PLEG): container finished" podID="a0c61344-19c2-4d8b-8aec-be86ac403866" containerID="dd0c5f025f4c2c0da30b109900f811e9adef2e3a8440f05090570aed661d1ab2" exitCode=0 Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.274766 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" event={"ID":"a0c61344-19c2-4d8b-8aec-be86ac403866","Type":"ContainerDied","Data":"dd0c5f025f4c2c0da30b109900f811e9adef2e3a8440f05090570aed661d1ab2"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.302832 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" event={"ID":"733046ae-dba4-407a-83ee-89677527d7cc","Type":"ContainerStarted","Data":"e8dc85adeda5cd98e4b00e145578a712a74da669dacc40d47acb917b73b46cee"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.309953 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.310644 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.810622961 +0000 UTC m=+231.676448639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.317207 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6zcl9" event={"ID":"a7917eb5-2c7a-426c-8850-4209cd22e790","Type":"ContainerStarted","Data":"2d6ff852c7deedc562fa7698d69b7273372b5dfa517516535d1a6d61cc0417b7"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.351466 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" event={"ID":"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf","Type":"ContainerStarted","Data":"8c1e14b780c8d057a0e03ec1bdf00c5b4724e82d7945c1fc451a73e4266e8aaa"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.353792 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" event={"ID":"80d86fac-74cc-41d4-81df-2e718c1568d9","Type":"ContainerStarted","Data":"b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.356180 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.362674 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-66l98" event={"ID":"34d2f5b9-1f8e-4413-b178-58cd10fa7548","Type":"ContainerStarted","Data":"22714bce4009fbdefd48088e188e121fb893b0cbb9f6cde85b3262cda24f5b02"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.368909 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" event={"ID":"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8","Type":"ContainerStarted","Data":"3738cfbb657b8c3e3f3ed4695414f1effb79774f36c714165ddd7793577c86ab"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.372716 4690 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dnpcn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.372778 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.374956 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" event={"ID":"a78540fe-014c-42e6-916c-3f39b4611a15","Type":"ContainerStarted","Data":"0909b932b320b1b8c204c88fcd62be33f49999f5f977a288e71b8033cefe1ee7"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.382945 4690 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" event={"ID":"6edcd95a-9780-4af1-9454-da6dce913528","Type":"ContainerStarted","Data":"70cb1d8b2759bc464b2f1b5963f6e8dd20669f1959072f34f2a3e85ea7ecb705"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.388837 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5m5vk" event={"ID":"f05b3314-839f-43ca-bb32-951ef0582151","Type":"ContainerStarted","Data":"508a28f0d8700428eaab803a99a4604acb955eb32e7dc3975635f4cc75727c07"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.414193 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.414366 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.91434625 +0000 UTC m=+231.780171928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.414624 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.415494 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:56.915486913 +0000 UTC m=+231.781312591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.415518 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" event={"ID":"73b9ca3f-754a-4970-85ef-b3203caee6e4","Type":"ContainerStarted","Data":"071a0b2021468477ce1a1f081a76d02e2a5ba1ff2470dd86402b5c8c164329bf"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.415549 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" event={"ID":"73b9ca3f-754a-4970-85ef-b3203caee6e4","Type":"ContainerStarted","Data":"7b5f2182520b0139a4518b266b20604116d579400f9aaf66947ead8607f79494"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.461987 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" event={"ID":"e770c47d-95d6-45be-87cb-1fa3922afa82","Type":"ContainerStarted","Data":"56dd07b5497b9286f3a4dcd539878b9c5165eb6bf04de36d4ea793c1b55bd574"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.465453 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" event={"ID":"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c","Type":"ContainerStarted","Data":"e9b579e057c9afb7b1055555eb150c22394d9f1d4ead21d0a67e8f77da1e78a8"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.474088 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" event={"ID":"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7","Type":"ContainerStarted","Data":"7c883029afcaee1df8904c4bd5026104a15217b7030cc658254878659acec2d5"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.490424 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" event={"ID":"ace7a9fa-7eac-449c-8b61-6018d592fc4f","Type":"ContainerStarted","Data":"6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.490481 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" event={"ID":"ace7a9fa-7eac-449c-8b61-6018d592fc4f","Type":"ContainerStarted","Data":"e4740c419cc549513d6adc1b04914bc451eaf6b9fd7a94d81b81a3802be5c2b5"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.491245 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.498140 4690 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-zklpl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.498199 4690 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" podUID="ace7a9fa-7eac-449c-8b61-6018d592fc4f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.498579 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" event={"ID":"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1","Type":"ContainerStarted","Data":"93d831e74f24e08d4ff373d9cfa1752e6b59a908cb46d693cd1e49d9731273cb"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.515609 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.515714 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.01569863 +0000 UTC m=+231.881524308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.515968 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.516380 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.01636472 +0000 UTC m=+231.882190398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.527310 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ppgjz" event={"ID":"c4eaf3f2-8536-46bf-8c5f-82606abec128","Type":"ContainerStarted","Data":"afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.571813 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" event={"ID":"9a646659-b6c9-42c0-9bc8-ae149ad8ba85","Type":"ContainerStarted","Data":"7946a240a6b317e1a12736233484b1e5f6db9860153c4e117cc6f262edd3f863"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.584017 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6rktv" event={"ID":"daded41b-1e26-4dde-aded-4a2e3c1dc4fd","Type":"ContainerStarted","Data":"03c3480c520d6902b80fa758debfdc6a0bae8ed1e73d9bd744e98d4c5f810981"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.586670 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-88brt" event={"ID":"43391457-a499-43df-82a4-15be4ce2a0ac","Type":"ContainerStarted","Data":"8ed0b0bc77aefbe3471755ee8c78f70bc17c608e2d9086e5f9a4869c1b74f8ff"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.600695 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" event={"ID":"dc26c755-5e1b-480b-b3ed-b3d3dee36d94","Type":"ContainerStarted","Data":"5964735bb2d8daf1cde361fcf00eeaa60f06a5f820f04d1a33a9ec7a459e2558"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.628330 4690 generic.go:334] "Generic (PLEG): container finished" podID="c73bcf80-34dc-466e-b1b0-a92850850498" containerID="ed503d1747f7bd41cbe8e2808ba4f26c423463f45001664aa66a0d1c8186ec62" exitCode=0 Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.628347 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" event={"ID":"c73bcf80-34dc-466e-b1b0-a92850850498","Type":"ContainerDied","Data":"ed503d1747f7bd41cbe8e2808ba4f26c423463f45001664aa66a0d1c8186ec62"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.629018 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.629130 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.129107732 +0000 UTC m=+231.994933410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.633001 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.638791 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.138767623 +0000 UTC m=+232.004593301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.638808 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46824: no serving certificate available for the kubelet" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.638875 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:35:56 crc kubenswrapper[4690]: [-]has-synced failed: reason withheld Mar 20 17:35:56 crc kubenswrapper[4690]: [+]process-running ok Mar 20 17:35:56 crc kubenswrapper[4690]: healthz check failed Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.638919 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.658268 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" event={"ID":"a151c473-d304-4e1d-ba12-7860c0efbac9","Type":"ContainerStarted","Data":"bee05fde2edae94cd34435c6a21b00bf558ab5eeb61f437389285e80a6dd18c3"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.662743 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" event={"ID":"51ad4830-9e57-4bf2-91e5-7c24c7648d8b","Type":"ContainerStarted","Data":"04cc1b4f2e6849d4499feb049d0db8a49c45df812913a400db0cc9b0f338e33b"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.678114 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" event={"ID":"fcf13749-fd7c-4f01-9598-7f041910cd74","Type":"ContainerStarted","Data":"857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.678866 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.682745 4690 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wbkxs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.682800 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" podUID="fcf13749-fd7c-4f01-9598-7f041910cd74" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.702023 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vwgk4"] Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.703199 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46836: no serving certificate available for the kubelet" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.704035 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.719714 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" event={"ID":"ff4fe98d-c7c0-475a-85cb-70ab2c4ad122","Type":"ContainerStarted","Data":"a974b4f80c1dc4c126773ee3c3a17187a2a184ec909db149a0a43716d79b59ae"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.723602 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwgk4"] Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.727730 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.727970 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" event={"ID":"bcdf1a44-e01e-4f8d-a5dd-f050ff98f14d","Type":"ContainerStarted","Data":"6f0c69aa58b8646e557cde8d0bf06aa0072f1fd767dc50948e64b69d66429ab2"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.742671 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.743173 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:35:57.243122481 +0000 UTC m=+232.108948159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.747283 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.748499 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.248483097 +0000 UTC m=+232.114308855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.749549 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" event={"ID":"fc1d890d-f494-466b-94a2-03c2d2c3fe7f","Type":"ContainerStarted","Data":"4f8efce5ee8b079d5074bef550edba6620626be71f082b38b36041338016bc53"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.757937 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" event={"ID":"ee298daa-0334-4d62-b83f-7c2499f55af6","Type":"ContainerStarted","Data":"bc2af72de0a782afe37592f62b4f32cdda6b73aa1b8c70ac1cc76bda8ddb12d8"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.757983 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" event={"ID":"ee298daa-0334-4d62-b83f-7c2499f55af6","Type":"ContainerStarted","Data":"14ac15b30cda59beaeeadf2209146e2b69dc8d02151bba65bab27ddc2157eb00"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.763180 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fqlzx" event={"ID":"861dbb3a-f563-415b-ae55-45dfd9f7208b","Type":"ContainerStarted","Data":"675ba5ccbaf14bf6d6dcc8b5f67bbe1eb22a459f71ffd549f2cbff6cc8e11abd"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.763436 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fqlzx" 
event={"ID":"861dbb3a-f563-415b-ae55-45dfd9f7208b","Type":"ContainerStarted","Data":"93bb8e45bf10ce15073bb806f911bc1ab9467e169b1d1580b901ef05041fa8ff"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.786561 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" event={"ID":"429d8b5d-8e50-4115-89e7-1c8d3f53bd27","Type":"ContainerStarted","Data":"56737e91b61ebe2374631663771598b4853dbd96f60bf63960cc16a6f04cde83"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.815188 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46844: no serving certificate available for the kubelet" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.815814 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" event={"ID":"1a46a1ee-5f40-4d85-b726-d758b7ceff37","Type":"ContainerStarted","Data":"7f1b4bedf31dd6cf015182945d19cf4cd140006f9784c489c1d61c2a1433d0bd"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.848509 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.849871 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-utilities\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.849901 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtsrr\" (UniqueName: \"kubernetes.io/projected/416d626a-ef44-4b4e-91ce-51042b01a45a-kube-api-access-jtsrr\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.849980 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-catalog-content\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.852884 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" event={"ID":"bb444275-6cc1-42be-b742-afc344a60995","Type":"ContainerStarted","Data":"082b2deee4d47e2c5836da70e2823f17de3b014adc4748399588fcb8dd3c3e4c"} Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.856474 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.35644833 +0000 UTC m=+232.222274008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.864756 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" event={"ID":"03f86e30-e6e2-473e-8a52-c1e86d28c2e2","Type":"ContainerStarted","Data":"91f674c647b1cbb8c68f356504fcd0b8547628a097fac43eb8b4a938b1131e9a"} Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.867304 4690 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9wf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.867390 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9wf6" podUID="789eef8f-04a8-44cf-9e16-878de3a035bb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.903047 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ppgjz" podStartSLOduration=188.903008525 podStartE2EDuration="3m8.903008525s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:56.883760585 +0000 UTC m=+231.749586273" watchObservedRunningTime="2026-03-20 17:35:56.903008525 +0000 UTC m=+231.768834203" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.911069 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4m7xw"] Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.912236 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.913172 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46856: no serving certificate available for the kubelet" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.917721 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5m5vk" podStartSLOduration=6.917697573 podStartE2EDuration="6.917697573s" podCreationTimestamp="2026-03-20 17:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:56.90283186 +0000 UTC m=+231.768657538" watchObservedRunningTime="2026-03-20 17:35:56.917697573 +0000 UTC m=+231.783523251" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.926474 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4m7xw"] Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.943544 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" podStartSLOduration=188.943527765 podStartE2EDuration="3m8.943527765s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:56.942709831 +0000 UTC m=+231.808535519" watchObservedRunningTime="2026-03-20 17:35:56.943527765 +0000 UTC m=+231.809353443" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.948243 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.951845 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.951970 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtsrr\" (UniqueName: \"kubernetes.io/projected/416d626a-ef44-4b4e-91ce-51042b01a45a-kube-api-access-jtsrr\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.952007 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-utilities\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.952090 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-catalog-content\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:56 crc kubenswrapper[4690]: E0320 17:35:56.954584 4690 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.454565706 +0000 UTC m=+232.320391384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.957418 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-utilities\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:56 crc kubenswrapper[4690]: I0320 17:35:56.958703 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-catalog-content\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.011539 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46860: no serving certificate available for the kubelet" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.048486 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtsrr\" (UniqueName: \"kubernetes.io/projected/416d626a-ef44-4b4e-91ce-51042b01a45a-kube-api-access-jtsrr\") pod \"certified-operators-vwgk4\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.053093 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.053444 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-utilities\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.053482 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jpch\" (UniqueName: \"kubernetes.io/projected/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-kube-api-access-5jpch\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.053584 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-catalog-content\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.065424 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.553943569 +0000 UTC m=+232.419769247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.071213 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fqfhl" podStartSLOduration=189.071191101 podStartE2EDuration="3m9.071191101s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.021770983 +0000 UTC m=+231.887596681" watchObservedRunningTime="2026-03-20 17:35:57.071191101 +0000 UTC m=+231.937016779" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.092560 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-fqlzx" podStartSLOduration=7.092539063 podStartE2EDuration="7.092539063s" podCreationTimestamp="2026-03-20 17:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.075402314 +0000 UTC m=+231.941228012" watchObservedRunningTime="2026-03-20 17:35:57.092539063 +0000 UTC m=+231.958364741" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.093931 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.094815 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bnxz2"] Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.102142 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.116759 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46868: no serving certificate available for the kubelet" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.126021 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4d" podStartSLOduration=189.125987076 podStartE2EDuration="3m9.125987076s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.103401239 +0000 UTC m=+231.969226917" watchObservedRunningTime="2026-03-20 17:35:57.125987076 +0000 UTC m=+231.991812754" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.156062 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bnxz2"] Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.157003 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-catalog-content\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.157082 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-catalog-content\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.157114 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd42m\" (UniqueName: \"kubernetes.io/projected/7552fec8-7b03-4ad9-8410-1705f639433e-kube-api-access-nd42m\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.157140 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.157160 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-utilities\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.157177 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-utilities\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 
17:35:57.157200 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jpch\" (UniqueName: \"kubernetes.io/projected/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-kube-api-access-5jpch\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.157870 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.657850104 +0000 UTC m=+232.523675842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.157932 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-catalog-content\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.158168 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-utilities\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.159606 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkhg7" podStartSLOduration=189.159575524 podStartE2EDuration="3m9.159575524s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.13952184 +0000 UTC m=+232.005347528" watchObservedRunningTime="2026-03-20 17:35:57.159575524 +0000 UTC m=+232.025401222" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.205759 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" podStartSLOduration=189.205718587 podStartE2EDuration="3m9.205718587s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.202337809 +0000 UTC m=+232.068163507" watchObservedRunningTime="2026-03-20 17:35:57.205718587 +0000 UTC m=+232.071544265" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.221237 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46870: no serving certificate available for the kubelet" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.227190 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jpch\" (UniqueName: 
\"kubernetes.io/projected/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-kube-api-access-5jpch\") pod \"community-operators-4m7xw\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.249690 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dxqqz" podStartSLOduration=189.249667306 podStartE2EDuration="3m9.249667306s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.244916099 +0000 UTC m=+232.110741777" watchObservedRunningTime="2026-03-20 17:35:57.249667306 +0000 UTC m=+232.115492984" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.258921 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.259237 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd42m\" (UniqueName: \"kubernetes.io/projected/7552fec8-7b03-4ad9-8410-1705f639433e-kube-api-access-nd42m\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.259691 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-utilities\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.259834 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-catalog-content\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.261174 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.76113915 +0000 UTC m=+232.626964828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.261562 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-catalog-content\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.261668 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-utilities\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.288804 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.313341 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sdqjm"] Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.314768 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.331513 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd42m\" (UniqueName: \"kubernetes.io/projected/7552fec8-7b03-4ad9-8410-1705f639433e-kube-api-access-nd42m\") pod \"certified-operators-bnxz2\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.332554 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdqjm"] Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.348821 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" podStartSLOduration=189.348794651 podStartE2EDuration="3m9.348794651s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.347607417 +0000 UTC m=+232.213433085" watchObservedRunningTime="2026-03-20 17:35:57.348794651 +0000 UTC m=+232.214620330" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.359412 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46886: no serving certificate available for the kubelet" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.361807 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.361884 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-catalog-content\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.361908 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-utilities\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.361951 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26622\" (UniqueName: \"kubernetes.io/projected/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-kube-api-access-26622\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.362374 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.862357676 +0000 UTC m=+232.728183354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.368518 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.393244 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j6k6w" podStartSLOduration=189.393224285 podStartE2EDuration="3m9.393224285s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.390177836 +0000 UTC m=+232.256003524" watchObservedRunningTime="2026-03-20 17:35:57.393224285 +0000 UTC m=+232.259049963" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.406619 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fq57l" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.462763 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.464403 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" podStartSLOduration=189.464386176 podStartE2EDuration="3m9.464386176s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.457146736 +0000 UTC m=+232.322972414" watchObservedRunningTime="2026-03-20 17:35:57.464386176 +0000 UTC m=+232.330211854" Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.465386 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.965362235 +0000 UTC m=+232.831187913 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.469861 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.469914 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-catalog-content\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.469935 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-utilities\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.469972 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26622\" (UniqueName: \"kubernetes.io/projected/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-kube-api-access-26622\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.471586 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:57.971570916 +0000 UTC m=+232.837396614 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.482689 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-utilities\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.483440 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-catalog-content\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.529723 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26622\" (UniqueName: \"kubernetes.io/projected/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-kube-api-access-26622\") pod \"community-operators-sdqjm\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.572954 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.573708 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.073692928 +0000 UTC m=+232.939518596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.662553 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:35:57 crc kubenswrapper[4690]: [-]has-synced failed: reason withheld Mar 20 17:35:57 crc kubenswrapper[4690]: [+]process-running ok Mar 20 17:35:57 crc kubenswrapper[4690]: healthz check failed Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.662603 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.684732 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.685031 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.185019079 +0000 UTC m=+233.050844757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.701134 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwgk4"] Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.708003 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.786138 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.786323 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.286296347 +0000 UTC m=+233.152122025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.786529 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.786847 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.286839843 +0000 UTC m=+233.152665521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.915930 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:57 crc kubenswrapper[4690]: E0320 17:35:57.916888 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.416870949 +0000 UTC m=+233.282696627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.966280 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" event={"ID":"e770c47d-95d6-45be-87cb-1fa3922afa82","Type":"ContainerStarted","Data":"d7f1a7ac0a30692f1ff46ea6c88db53e27168470263cb18c92e561d02fc1beba"} Mar 20 17:35:57 crc kubenswrapper[4690]: I0320 17:35:57.966322 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" event={"ID":"e770c47d-95d6-45be-87cb-1fa3922afa82","Type":"ContainerStarted","Data":"b3e6f4ed703e3f89df9325ba3d4aaaae56e697180ca97dcbb0a6be27318bfc19"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.002323 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" event={"ID":"e6328cb4-ec5c-4913-b7b9-ed18d759d7f1","Type":"ContainerStarted","Data":"173ce1e87df0eef7a6ca82856b9307cb0151430f449a51d7d41ec365006582f2"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.017146 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ft28f" podStartSLOduration=190.017124187 podStartE2EDuration="3m10.017124187s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:57.99008506 +0000 UTC m=+232.855910738" watchObservedRunningTime="2026-03-20 17:35:58.017124187 +0000 UTC m=+232.882949865" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.018995 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.019522 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.519508716 +0000 UTC m=+233.385334394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.025685 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4m7xw"] Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.039897 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7pjq" podStartSLOduration=190.039863529 podStartE2EDuration="3m10.039863529s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.038605282 +0000 UTC m=+232.904430960" watchObservedRunningTime="2026-03-20 17:35:58.039863529 +0000 UTC m=+232.905689207" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.046419 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" event={"ID":"73b9ca3f-754a-4970-85ef-b3203caee6e4","Type":"ContainerStarted","Data":"c961ffafc1db7e463b510c7264a7706ae84139f52dd1350e871b98bbcae936cb"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.095095 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kp5n" podStartSLOduration=190.095081306 podStartE2EDuration="3m10.095081306s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.093314275 +0000 UTC m=+232.959139973" watchObservedRunningTime="2026-03-20 17:35:58.095081306 +0000 UTC m=+232.960906984" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.098622 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46896: no serving certificate available for the kubelet" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.109074 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" event={"ID":"d0a0c8bb-22ed-4ebb-aaf6-37d9a2e15a7c","Type":"ContainerStarted","Data":"f52efe3bd9307e096b83f2606d90346c89e25a473f815ad8a28d828c8c259646"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.123093 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" event={"ID":"733046ae-dba4-407a-83ee-89677527d7cc","Type":"ContainerStarted","Data":"72eea33ada77d0fa4394d0ada06052920321ea0ba958424d8f4c1373b3f68808"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.125533 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.126854 4690 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.626837251 +0000 UTC m=+233.492662969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.135016 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6rktv" event={"ID":"daded41b-1e26-4dde-aded-4a2e3c1dc4fd","Type":"ContainerStarted","Data":"d2de4469a1967bd8ae032a4724b158c9bd5c81c5c7b5ff7a1f60109de35e2fcb"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.143860 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" event={"ID":"d11ac9c7-0d8b-4a2c-a60f-7a0e88b01fa7","Type":"ContainerStarted","Data":"635a0ec60ad1c19475e7122523afd1602fab62a146adef65bc760c84a2cbc16c"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.157732 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-d7smm" podStartSLOduration=190.15771733 podStartE2EDuration="3m10.15771733s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.155709981 +0000 UTC m=+233.021535659" watchObservedRunningTime="2026-03-20 17:35:58.15771733 +0000 UTC m=+233.023543008" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.174827 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" event={"ID":"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8","Type":"ContainerStarted","Data":"ec97b758c51d82261c4e07a6319d61b44146e70937f60a47408524396489e51f"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.174878 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" event={"ID":"31ca74dd-dc4d-466a-8ca3-48f9b2d3e9f8","Type":"ContainerStarted","Data":"8433ecd3f60dafa9571e393487c4faab714b8f565fa268ac1cb06571fa45fd8b"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.175726 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.192180 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bnxz2"] Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.193147 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s6xz" podStartSLOduration=190.193132481 podStartE2EDuration="3m10.193132481s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:35:58.187850087 +0000 UTC m=+233.053675785" watchObservedRunningTime="2026-03-20 17:35:58.193132481 +0000 UTC m=+233.058958149" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.197145 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" event={"ID":"51ad4830-9e57-4bf2-91e5-7c24c7648d8b","Type":"ContainerStarted","Data":"0c42de82518c879b2eb056e4019db64c364d4af0958b0a3f6e906effba368cad"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.199402 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6zcl9" event={"ID":"a7917eb5-2c7a-426c-8850-4209cd22e790","Type":"ContainerStarted","Data":"b77f97b1c8b7ffc5151f323a70a970806973d58d2565bc13c347996d90726ac5"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.199866 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.205143 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-275fn" podStartSLOduration=190.205127 podStartE2EDuration="3m10.205127s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.202621487 +0000 UTC m=+233.068447165" watchObservedRunningTime="2026-03-20 17:35:58.205127 +0000 UTC m=+233.070952678" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.214054 4690 patch_prober.go:28] interesting pod/console-operator-58897d9998-6zcl9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/readyz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.214092 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6zcl9" podUID="a7917eb5-2c7a-426c-8850-4209cd22e790" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/readyz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.230401 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" podStartSLOduration=190.230364685 podStartE2EDuration="3m10.230364685s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.229808809 +0000 UTC m=+233.095634497" watchObservedRunningTime="2026-03-20 17:35:58.230364685 +0000 UTC m=+233.096190363" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.230638 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.230854 4690 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.730844319 +0000 UTC m=+233.596669997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.250596 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwgk4" event={"ID":"416d626a-ef44-4b4e-91ce-51042b01a45a","Type":"ContainerStarted","Data":"e8f670484c751ffd572d834f9d49b8b85a59e9f0b8533556624448b2653f7d87"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.257191 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6zcl9" podStartSLOduration=190.257178835 podStartE2EDuration="3m10.257178835s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.256598648 +0000 UTC m=+233.122424326" watchObservedRunningTime="2026-03-20 17:35:58.257178835 +0000 UTC m=+233.123004513" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.273244 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" event={"ID":"c73bcf80-34dc-466e-b1b0-a92850850498","Type":"ContainerStarted","Data":"8986fcb213df28dab668db3277363e4e2830ef537cf96ecea2caf7a214cdc40b"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.286017 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" podStartSLOduration=190.285999974 podStartE2EDuration="3m10.285999974s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.283904873 +0000 UTC m=+233.149730551" watchObservedRunningTime="2026-03-20 17:35:58.285999974 +0000 UTC m=+233.151825652" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.333359 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" event={"ID":"6edcd95a-9780-4af1-9454-da6dce913528","Type":"ContainerStarted","Data":"4ad68c48469fc45cc6c2600b3eda5d51ce6c7c2eee5b507384e055b1111c2c71"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.333406 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" event={"ID":"6edcd95a-9780-4af1-9454-da6dce913528","Type":"ContainerStarted","Data":"a9a62a2c2c5b74d7a704a4399d6ded4bc1a1f09169af6a0b7d8bd38c8038efec"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.343124 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.353403 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.853376866 +0000 UTC m=+233.719202544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.361254 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" podStartSLOduration=190.361233894 podStartE2EDuration="3m10.361233894s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.360945566 +0000 UTC m=+233.226771244" watchObservedRunningTime="2026-03-20 17:35:58.361233894 +0000 UTC m=+233.227059572" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.362487 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" event={"ID":"a151c473-d304-4e1d-ba12-7860c0efbac9","Type":"ContainerStarted","Data":"ce65ea21aecc6d8369f0272dd05510ed768f09bd34fdac7a90c10e336eae55cc"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.363096 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.365781 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" event={"ID":"1a46a1ee-5f40-4d85-b726-d758b7ceff37","Type":"ContainerStarted","Data":"ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.366701 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.367503 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.369903 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.869889166 +0000 UTC m=+233.735714834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.384469 4690 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2b2sh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused" start-of-body= Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.384539 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" podUID="1a46a1ee-5f40-4d85-b726-d758b7ceff37" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.384871 4690 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tv6bv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.384912 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" podUID="a151c473-d304-4e1d-ba12-7860c0efbac9" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.388552 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" event={"ID":"bb444275-6cc1-42be-b742-afc344a60995","Type":"ContainerStarted","Data":"3e76a7fbb7d8db816a953a9639a743b78e84b85ddd696467f6fb8efd8a1787fb"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.388603 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" event={"ID":"bb444275-6cc1-42be-b742-afc344a60995","Type":"ContainerStarted","Data":"68d5f4d981ab762e761c5eb47150d8364f848bc50b52d51c3e7eb342c81b7df8"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.407065 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" event={"ID":"b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf","Type":"ContainerStarted","Data":"be0bb980670ae78ef5015749587161b36b4d2e998538234a0a726a1a7e0ab9f2"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.408208 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.457928 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" event={"ID":"fc1d890d-f494-466b-94a2-03c2d2c3fe7f","Type":"ContainerStarted","Data":"056be792bc41ef6f1765063aed0d3cae496ba525094404a5ec64aaad1a0785f6"} Mar 20 17:35:58 crc kubenswrapper[4690]: 
I0320 17:35:58.458243 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" event={"ID":"fc1d890d-f494-466b-94a2-03c2d2c3fe7f","Type":"ContainerStarted","Data":"0174f7f495cc303ef72265b25985e1394287c6f547b35f6c083fb6a975a497fb"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.466554 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sfhzf" podStartSLOduration=190.4665387 podStartE2EDuration="3m10.4665387s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.465324514 +0000 UTC m=+233.331150192" watchObservedRunningTime="2026-03-20 17:35:58.4665387 +0000 UTC m=+233.332364368" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.470960 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.472695 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:58.972677409 +0000 UTC m=+233.838503087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.488495 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" event={"ID":"03f86e30-e6e2-473e-8a52-c1e86d28c2e2","Type":"ContainerStarted","Data":"703fc82a0d26e5620a573c19a2ae4b9e7776c9253848f6560f212c1a48b1f19a"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.522658 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" event={"ID":"a78540fe-014c-42e6-916c-3f39b4611a15","Type":"ContainerStarted","Data":"608ae962bf07c05ed41195a6b202ce377646fc8e4e9b1b5136fc99d208490ad1"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.573181 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.574853 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" 
event={"ID":"a0c61344-19c2-4d8b-8aec-be86ac403866","Type":"ContainerStarted","Data":"69ea658a1a25ad4067c749cfe0edd28251246d86fb3c699d14e3be7b3d438655"} Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.575880 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.075866792 +0000 UTC m=+233.941692460 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.610842 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdqjm"] Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.626513 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pvtf" podStartSLOduration=190.626496916 podStartE2EDuration="3m10.626496916s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.609054219 +0000 UTC m=+233.474879897" watchObservedRunningTime="2026-03-20 17:35:58.626496916 +0000 UTC m=+233.492322594" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.634606 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:35:58 crc kubenswrapper[4690]: [-]has-synced failed: reason withheld Mar 20 17:35:58 crc kubenswrapper[4690]: [+]process-running ok Mar 20 17:35:58 crc kubenswrapper[4690]: healthz check failed Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.634657 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.661225 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" event={"ID":"429d8b5d-8e50-4115-89e7-1c8d3f53bd27","Type":"ContainerStarted","Data":"492b39eeea41f16470a4e08387f304bbf4e19cee8f7eff6ffeec5223ac01622d"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.661274 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" event={"ID":"429d8b5d-8e50-4115-89e7-1c8d3f53bd27","Type":"ContainerStarted","Data":"8edf6f67195829f010b9d45829c54e9fad6f483897804db43b09879c8929ce47"} Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.662307 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" podUID="fcf13749-fd7c-4f01-9598-7f041910cd74" 
containerName="controller-manager" containerID="cri-o://857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1" gracePeriod=30 Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.664163 4690 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9wf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.664231 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9wf6" podUID="789eef8f-04a8-44cf-9e16-878de3a035bb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.666436 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" podUID="ace7a9fa-7eac-449c-8b61-6018d592fc4f" containerName="route-controller-manager" containerID="cri-o://6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928" gracePeriod=30 Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.670536 4690 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dnpcn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.670583 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.675816 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.676390 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.176372968 +0000 UTC m=+234.042198646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.703847 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.708598 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.722640 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" podStartSLOduration=190.722624385 podStartE2EDuration="3m10.722624385s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.685477403 +0000 UTC m=+233.551303071" watchObservedRunningTime="2026-03-20 17:35:58.722624385 +0000 UTC m=+233.588450053" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.761696 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" podStartSLOduration=190.761665681 podStartE2EDuration="3m10.761665681s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.760052904 +0000 UTC m=+233.625878582" watchObservedRunningTime="2026-03-20 17:35:58.761665681 +0000 UTC m=+233.627491349" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.763482 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" podStartSLOduration=190.763475534 podStartE2EDuration="3m10.763475534s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.725127138 +0000 UTC m=+233.590952816" watchObservedRunningTime="2026-03-20 17:35:58.763475534 +0000 UTC m=+233.629301212" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.776896 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.777790 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.27777937 +0000 UTC m=+234.143605048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.787834 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9n98c" podStartSLOduration=190.787819443 podStartE2EDuration="3m10.787819443s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.785103784 +0000 UTC m=+233.650929462" watchObservedRunningTime="2026-03-20 17:35:58.787819443 +0000 UTC m=+233.653645121" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.876133 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jtggf" podStartSLOduration=190.876118593 podStartE2EDuration="3m10.876118593s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.874847956 +0000 UTC m=+233.740673634" watchObservedRunningTime="2026-03-20 17:35:58.876118593 +0000 UTC m=+233.741944271" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.878970 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.879317 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.379304406 +0000 UTC m=+234.245130084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.890371 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ssfrq"] Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.891742 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.900308 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.923034 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cg94j" podStartSLOduration=190.923017248 podStartE2EDuration="3m10.923017248s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:58.920618278 +0000 UTC m=+233.786443966" watchObservedRunningTime="2026-03-20 17:35:58.923017248 +0000 UTC m=+233.788842926" Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.924095 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssfrq"] Mar 20 17:35:58 crc kubenswrapper[4690]: I0320 17:35:58.983128 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:58 crc kubenswrapper[4690]: E0320 17:35:58.983498 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.483485599 +0000 UTC m=+234.349311277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.046300 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" podStartSLOduration=191.046286067 podStartE2EDuration="3m11.046286067s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:59.013585995 +0000 UTC m=+233.879411683" watchObservedRunningTime="2026-03-20 17:35:59.046286067 +0000 UTC m=+233.912111755" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.085200 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.085686 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.585669303 +0000 UTC m=+234.451494981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.085728 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-catalog-content\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.085791 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phklv\" (UniqueName: \"kubernetes.io/projected/244c63f8-c484-4edb-9cb6-0ac6a9dac136-kube-api-access-phklv\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.085816 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.085838 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-utilities\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.086086 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.586079835 +0000 UTC m=+234.451905513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.186453 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.186565 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phklv\" (UniqueName: \"kubernetes.io/projected/244c63f8-c484-4edb-9cb6-0ac6a9dac136-kube-api-access-phklv\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.186602 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-utilities\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.186656 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-catalog-content\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.187057 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-catalog-content\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.187121 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.687108146 +0000 UTC m=+234.552933824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.187563 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-utilities\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.237276 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phklv\" (UniqueName: \"kubernetes.io/projected/244c63f8-c484-4edb-9cb6-0ac6a9dac136-kube-api-access-phklv\") pod \"redhat-marketplace-ssfrq\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.290049 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.290374 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.790363082 +0000 UTC m=+234.656188760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.298213 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6f26l"] Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.303270 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.315560 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.376569 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f26l"] Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.390708 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.390783 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.890768585 +0000 UTC m=+234.756594263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.391058 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-catalog-content\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.391092 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-utilities\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.391112 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8gtg\" (UniqueName: \"kubernetes.io/projected/e896e412-2900-44e4-908c-de1883bd9cdc-kube-api-access-v8gtg\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.391158 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.391422 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.891415584 +0000 UTC m=+234.757241262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.412452 4690 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-2n45v container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.412740 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" podUID="b3bcd08e-d3ff-4cc6-8a32-1d43add9fddf" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.492317 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.492552 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:59.992531727 +0000 UTC m=+234.858357405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.492723 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-catalog-content\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.492766 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-utilities\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.492787 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8gtg\" (UniqueName: \"kubernetes.io/projected/e896e412-2900-44e4-908c-de1883bd9cdc-kube-api-access-v8gtg\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.492860 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.510914 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.010893562 +0000 UTC m=+234.876719240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.518951 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46906: no serving certificate available for the kubelet" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.529108 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-utilities\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.529527 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-catalog-content\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.548444 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8gtg\" (UniqueName: \"kubernetes.io/projected/e896e412-2900-44e4-908c-de1883bd9cdc-kube-api-access-v8gtg\") pod \"redhat-marketplace-6f26l\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.561723 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.593667 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-client-ca\") pod \"fcf13749-fd7c-4f01-9598-7f041910cd74\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.593755 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-config\") pod \"fcf13749-fd7c-4f01-9598-7f041910cd74\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.593776 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2657z\" (UniqueName: \"kubernetes.io/projected/fcf13749-fd7c-4f01-9598-7f041910cd74-kube-api-access-2657z\") pod \"fcf13749-fd7c-4f01-9598-7f041910cd74\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.593799 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf13749-fd7c-4f01-9598-7f041910cd74-serving-cert\") pod \"fcf13749-fd7c-4f01-9598-7f041910cd74\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.593817 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-proxy-ca-bundles\") pod \"fcf13749-fd7c-4f01-9598-7f041910cd74\" (UID: \"fcf13749-fd7c-4f01-9598-7f041910cd74\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.594019 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.594329 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.09431325 +0000 UTC m=+234.960138918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.595658 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-client-ca" (OuterVolumeSpecName: "client-ca") pod "fcf13749-fd7c-4f01-9598-7f041910cd74" (UID: "fcf13749-fd7c-4f01-9598-7f041910cd74"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.596376 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fcf13749-fd7c-4f01-9598-7f041910cd74" (UID: "fcf13749-fd7c-4f01-9598-7f041910cd74"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.596986 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-config" (OuterVolumeSpecName: "config") pod "fcf13749-fd7c-4f01-9598-7f041910cd74" (UID: "fcf13749-fd7c-4f01-9598-7f041910cd74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.610134 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf13749-fd7c-4f01-9598-7f041910cd74-kube-api-access-2657z" (OuterVolumeSpecName: "kube-api-access-2657z") pod "fcf13749-fd7c-4f01-9598-7f041910cd74" (UID: "fcf13749-fd7c-4f01-9598-7f041910cd74"). InnerVolumeSpecName "kube-api-access-2657z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.610518 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf13749-fd7c-4f01-9598-7f041910cd74-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fcf13749-fd7c-4f01-9598-7f041910cd74" (UID: "fcf13749-fd7c-4f01-9598-7f041910cd74"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.622447 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d7677567-tbqjb"] Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.622635 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf13749-fd7c-4f01-9598-7f041910cd74" containerName="controller-manager" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.622646 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf13749-fd7c-4f01-9598-7f041910cd74" containerName="controller-manager" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.622769 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf13749-fd7c-4f01-9598-7f041910cd74" containerName="controller-manager" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.623156 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.637579 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:35:59 crc kubenswrapper[4690]: [-]has-synced failed: reason withheld Mar 20 17:35:59 crc kubenswrapper[4690]: [+]process-running ok Mar 20 17:35:59 crc kubenswrapper[4690]: healthz check failed Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.637631 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.663957 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d7677567-tbqjb"] Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.683513 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.698104 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.698217 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.698228 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2657z\" (UniqueName: \"kubernetes.io/projected/fcf13749-fd7c-4f01-9598-7f041910cd74-kube-api-access-2657z\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.698237 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf13749-fd7c-4f01-9598-7f041910cd74-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.698244 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.698269 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcf13749-fd7c-4f01-9598-7f041910cd74-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.698525 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.198511784 +0000 UTC m=+235.064337462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.746924 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" event={"ID":"a0c61344-19c2-4d8b-8aec-be86ac403866","Type":"ContainerStarted","Data":"74c90661d2d26509c494450e3188f293ebe3d1911031bf22accd9e6468bf4ad9"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.747801 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.794781 4690 generic.go:334] "Generic (PLEG): container finished" podID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerID="95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f" exitCode=0 Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.794898 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdqjm" event={"ID":"29dcb3ba-2c4c-41f1-a655-02ce44ab280f","Type":"ContainerDied","Data":"95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.794925 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdqjm" event={"ID":"29dcb3ba-2c4c-41f1-a655-02ce44ab280f","Type":"ContainerStarted","Data":"e9c0b404db88fda2532c20b4fa52c70a84ce96a9ebc68696b0910d64f13a07ed"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.806761 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.806949 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-proxy-ca-bundles\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.806997 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b2dced-b34b-414b-aed5-0b94c0eba98c-serving-cert\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.807012 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-config\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " 
pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.807077 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7x7t\" (UniqueName: \"kubernetes.io/projected/03b2dced-b34b-414b-aed5-0b94c0eba98c-kube-api-access-g7x7t\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.807097 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-client-ca\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.807194 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.307179127 +0000 UTC m=+235.173004805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.834665 4690 generic.go:334] "Generic (PLEG): container finished" podID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerID="798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353" exitCode=0 Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.834786 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwgk4" event={"ID":"416d626a-ef44-4b4e-91ce-51042b01a45a","Type":"ContainerDied","Data":"798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.854555 4690 generic.go:334] "Generic (PLEG): container finished" podID="fcf13749-fd7c-4f01-9598-7f041910cd74" containerID="857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1" exitCode=0 Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.854611 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" event={"ID":"fcf13749-fd7c-4f01-9598-7f041910cd74","Type":"ContainerDied","Data":"857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.854637 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" event={"ID":"fcf13749-fd7c-4f01-9598-7f041910cd74","Type":"ContainerDied","Data":"8cb321c7eaacad68afc4f8a1a40ab6006f1bf22164f2dee3c73928959510feb7"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.854652 4690 scope.go:117] "RemoveContainer" containerID="857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.854753 4690 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wbkxs" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.908521 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-client-ca\") pod \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.908565 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace7a9fa-7eac-449c-8b61-6018d592fc4f-serving-cert\") pod \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.908593 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-config\") pod \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.908687 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6496f\" (UniqueName: \"kubernetes.io/projected/ace7a9fa-7eac-449c-8b61-6018d592fc4f-kube-api-access-6496f\") pod \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\" (UID: \"ace7a9fa-7eac-449c-8b61-6018d592fc4f\") " Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.911712 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7x7t\" (UniqueName: \"kubernetes.io/projected/03b2dced-b34b-414b-aed5-0b94c0eba98c-kube-api-access-g7x7t\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.911753 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-client-ca\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.911856 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-proxy-ca-bundles\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.911974 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b2dced-b34b-414b-aed5-0b94c0eba98c-serving-cert\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.912006 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.912022 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-config\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.916864 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-config\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.918148 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-proxy-ca-bundles\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.918487 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.418473817 +0000 UTC m=+235.284299495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.918694 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-client-ca\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.919190 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-config" (OuterVolumeSpecName: "config") pod "ace7a9fa-7eac-449c-8b61-6018d592fc4f" (UID: "ace7a9fa-7eac-449c-8b61-6018d592fc4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.926895 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-client-ca" (OuterVolumeSpecName: "client-ca") pod "ace7a9fa-7eac-449c-8b61-6018d592fc4f" (UID: "ace7a9fa-7eac-449c-8b61-6018d592fc4f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.929197 4690 generic.go:334] "Generic (PLEG): container finished" podID="ace7a9fa-7eac-449c-8b61-6018d592fc4f" containerID="6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928" exitCode=0 Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.929264 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b2dced-b34b-414b-aed5-0b94c0eba98c-serving-cert\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.929299 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.946024 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace7a9fa-7eac-449c-8b61-6018d592fc4f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ace7a9fa-7eac-449c-8b61-6018d592fc4f" (UID: "ace7a9fa-7eac-449c-8b61-6018d592fc4f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.956194 4690 generic.go:334] "Generic (PLEG): container finished" podID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerID="0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b" exitCode=0 Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.956572 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7x7t\" (UniqueName: \"kubernetes.io/projected/03b2dced-b34b-414b-aed5-0b94c0eba98c-kube-api-access-g7x7t\") pod \"controller-manager-d7677567-tbqjb\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.964571 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace7a9fa-7eac-449c-8b61-6018d592fc4f-kube-api-access-6496f" (OuterVolumeSpecName: "kube-api-access-6496f") pod "ace7a9fa-7eac-449c-8b61-6018d592fc4f" (UID: "ace7a9fa-7eac-449c-8b61-6018d592fc4f"). InnerVolumeSpecName "kube-api-access-6496f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.983446 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-88brt" event={"ID":"43391457-a499-43df-82a4-15be4ce2a0ac","Type":"ContainerStarted","Data":"2c692114411a8eea32727954a92656084cfbb597fe6d00344ac5038db066381a"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.983487 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zltxc"] Mar 20 17:35:59 crc kubenswrapper[4690]: E0320 17:35:59.983774 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace7a9fa-7eac-449c-8b61-6018d592fc4f" containerName="route-controller-manager" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.983788 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace7a9fa-7eac-449c-8b61-6018d592fc4f" containerName="route-controller-manager" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.983979 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace7a9fa-7eac-449c-8b61-6018d592fc4f" containerName="route-controller-manager" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.984517 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.986880 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbkxs"] Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.986912 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" event={"ID":"ace7a9fa-7eac-449c-8b61-6018d592fc4f","Type":"ContainerDied","Data":"6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.986932 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbkxs"] Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.986950 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zltxc"] Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.986961 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl" event={"ID":"ace7a9fa-7eac-449c-8b61-6018d592fc4f","Type":"ContainerDied","Data":"e4740c419cc549513d6adc1b04914bc451eaf6b9fd7a94d81b81a3802be5c2b5"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.986971 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7xw" event={"ID":"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18","Type":"ContainerDied","Data":"0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.986983 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7xw" event={"ID":"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18","Type":"ContainerStarted","Data":"354d0cde1963d6ff64bbe5e6e497b0642e6f4bdfe3ab86580d7ec32d37b65735"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.987477 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.988152 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6rktv" event={"ID":"daded41b-1e26-4dde-aded-4a2e3c1dc4fd","Type":"ContainerStarted","Data":"e78fc8c61650681626c143afdf2fdd70e25e39244cd3f2a65584b1bed7009fed"} Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.988270 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6rktv" Mar 20 17:35:59 crc kubenswrapper[4690]: I0320 17:35:59.989291 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.013668 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.013923 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6496f\" (UniqueName: \"kubernetes.io/projected/ace7a9fa-7eac-449c-8b61-6018d592fc4f-kube-api-access-6496f\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.013935 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.013945 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace7a9fa-7eac-449c-8b61-6018d592fc4f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.013952 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace7a9fa-7eac-449c-8b61-6018d592fc4f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.014452 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.51443777 +0000 UTC m=+235.380263448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.029104 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssfrq"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.033682 4690 generic.go:334] "Generic (PLEG): container finished" podID="7552fec8-7b03-4ad9-8410-1705f639433e" containerID="651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4" exitCode=0 Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.033822 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnxz2" event={"ID":"7552fec8-7b03-4ad9-8410-1705f639433e","Type":"ContainerDied","Data":"651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4"} Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.033848 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnxz2" event={"ID":"7552fec8-7b03-4ad9-8410-1705f639433e","Type":"ContainerStarted","Data":"c637e3c605641d5c710e7ff8dc8b37e56853cc1561ebddb04343fee132620166"} Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.069273 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nprpv" event={"ID":"51ad4830-9e57-4bf2-91e5-7c24c7648d8b","Type":"ContainerStarted","Data":"80f31e1c3a0bfb9fd0d9734493e1249a1e27ee751b91011c818660ffed34aa24"} Mar 20 17:36:00 crc kubenswrapper[4690]: W0320 17:36:00.089747 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244c63f8_c484_4edb_9cb6_0ac6a9dac136.slice/crio-ea5166e7133ca5ed7a8463d26ce78afda8c19a95a9619bfeb6567454c9547370 WatchSource:0}: Error finding container ea5166e7133ca5ed7a8463d26ce78afda8c19a95a9619bfeb6567454c9547370: Status 404 returned error can't find the container with id ea5166e7133ca5ed7a8463d26ce78afda8c19a95a9619bfeb6567454c9547370 Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.089868 4690 scope.go:117] "RemoveContainer" containerID="857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.101541 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.102405 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1\": container with ID starting with 857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1 not found: ID does not exist" containerID="857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.102458 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1"} err="failed to get container status 
\"857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1\": rpc error: code = NotFound desc = could not find container \"857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1\": container with ID starting with 857b206b63c2b51156893882f77a5e3f20a8a487e11bdfa8c3a5a58351eaedd1 not found: ID does not exist" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.102482 4690 scope.go:117] "RemoveContainer" containerID="6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.116147 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tv6bv" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.116421 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-catalog-content\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.116552 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87j8k\" (UniqueName: \"kubernetes.io/projected/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-kube-api-access-87j8k\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.116605 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.116637 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-utilities\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.117575 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.617559642 +0000 UTC m=+235.483385320 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.142434 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6rktv" podStartSLOduration=10.142412086 podStartE2EDuration="10.142412086s" podCreationTimestamp="2026-03-20 17:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:00.1178152 +0000 UTC m=+234.983640878" watchObservedRunningTime="2026-03-20 17:36:00.142412086 +0000 UTC m=+235.008237774" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.142930 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gj5xl"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.144156 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.153557 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gj5xl"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.179116 4690 scope.go:117] "RemoveContainer" containerID="6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.182832 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928\": container with ID starting with 6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928 not found: ID does not exist" containerID="6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.182878 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928"} err="failed to get container status \"6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928\": rpc error: code = NotFound desc = could not find container \"6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928\": container with ID starting with 6c74d17c78a3df4dd5bcdaf5c8462934420e7db51aaf05b8b5d442c885315928 not found: ID does not exist" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.186930 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567136-lsh75"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.187719 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-lsh75" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.205087 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-lsh75"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.217135 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.217524 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-utilities\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.217971 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-catalog-content\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.218331 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87j8k\" (UniqueName: \"kubernetes.io/projected/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-kube-api-access-87j8k\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.219610 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.719592073 +0000 UTC m=+235.585417761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.221935 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-utilities\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.229538 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-catalog-content\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.302613 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87j8k\" (UniqueName: \"kubernetes.io/projected/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-kube-api-access-87j8k\") pod \"redhat-operators-zltxc\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.315559 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.319755 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.319991 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.819980145 +0000 UTC m=+235.685805823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.325639 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4tmr\" (UniqueName: \"kubernetes.io/projected/068537fa-5883-4e11-a933-87706891d0ae-kube-api-access-m4tmr\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.325666 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2qf\" (UniqueName: \"kubernetes.io/projected/d1c872c1-ae2b-4fd2-bb6f-e387fab73a06-kube-api-access-nw2qf\") pod \"auto-csr-approver-29567136-lsh75\" (UID: \"d1c872c1-ae2b-4fd2-bb6f-e387fab73a06\") " pod="openshift-infra/auto-csr-approver-29567136-lsh75" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.325695 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-utilities\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.325721 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-catalog-content\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.331071 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zklpl"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.338178 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.339207 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.345218 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.345455 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.349481 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.369221 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.426991 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.427162 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4tmr\" (UniqueName: \"kubernetes.io/projected/068537fa-5883-4e11-a933-87706891d0ae-kube-api-access-m4tmr\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.427181 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2qf\" (UniqueName: \"kubernetes.io/projected/d1c872c1-ae2b-4fd2-bb6f-e387fab73a06-kube-api-access-nw2qf\") pod \"auto-csr-approver-29567136-lsh75\" (UID: \"d1c872c1-ae2b-4fd2-bb6f-e387fab73a06\") " pod="openshift-infra/auto-csr-approver-29567136-lsh75" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.427200 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-utilities\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.427216 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-catalog-content\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.427328 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.427392 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.427498 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:00.927482055 +0000 UTC m=+235.793307733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.428138 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-utilities\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.428221 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-catalog-content\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.448370 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4tmr\" (UniqueName: \"kubernetes.io/projected/068537fa-5883-4e11-a933-87706891d0ae-kube-api-access-m4tmr\") pod \"redhat-operators-gj5xl\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.448501 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2qf\" (UniqueName: \"kubernetes.io/projected/d1c872c1-ae2b-4fd2-bb6f-e387fab73a06-kube-api-access-nw2qf\") pod \"auto-csr-approver-29567136-lsh75\" (UID: \"d1c872c1-ae2b-4fd2-bb6f-e387fab73a06\") " pod="openshift-infra/auto-csr-approver-29567136-lsh75" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.454342 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f26l"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.481410 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d7677567-tbqjb"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.506234 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.517964 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-lsh75" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.529917 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.529998 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.530035 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.530327 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.030315708 +0000 UTC m=+235.896141386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.530381 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.562936 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.627174 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:36:00 crc kubenswrapper[4690]: [-]has-synced failed: reason withheld Mar 20 17:36:00 crc kubenswrapper[4690]: [+]process-running ok Mar 20 17:36:00 crc kubenswrapper[4690]: healthz check failed Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.627232 4690 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.631512 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.631998 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.131977598 +0000 UTC m=+235.997803276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.661302 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6zcl9" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.681135 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.732991 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.733866 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.233853823 +0000 UTC m=+236.099679501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.780088 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2n45v" Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.818835 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zltxc"] Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.834110 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.834296 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.334267955 +0000 UTC m=+236.200093633 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.834663 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.835024 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.335011837 +0000 UTC m=+236.200837515 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:00 crc kubenswrapper[4690]: I0320 17:36:00.936276 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:00 crc kubenswrapper[4690]: E0320 17:36:00.936665 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.436647366 +0000 UTC m=+236.302473044 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.008229 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-lsh75"] Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.037640 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.038001 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.537987896 +0000 UTC m=+236.403813564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.059565 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gj5xl"] Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.093857 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.103248 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zltxc" event={"ID":"edacb8ae-57ae-41f3-b13b-a423afa0e2dd","Type":"ContainerStarted","Data":"a56a35a3fc4901935cd2bf973dcec03fe7d2b7eba6651944115022317aa5c473"} Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.108269 4690 generic.go:334] "Generic (PLEG): container finished" podID="03f86e30-e6e2-473e-8a52-c1e86d28c2e2" containerID="703fc82a0d26e5620a573c19a2ae4b9e7776c9253848f6560f212c1a48b1f19a" exitCode=0 Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.108368 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" event={"ID":"03f86e30-e6e2-473e-8a52-c1e86d28c2e2","Type":"ContainerDied","Data":"703fc82a0d26e5620a573c19a2ae4b9e7776c9253848f6560f212c1a48b1f19a"} Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.116370 4690 generic.go:334] "Generic (PLEG): container finished" podID="e896e412-2900-44e4-908c-de1883bd9cdc" containerID="c8d4a838048210eaf37f5766143f03c2bf793451d4aeb61c3c91e10593cf9d46" exitCode=0 Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.116621 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f26l" event={"ID":"e896e412-2900-44e4-908c-de1883bd9cdc","Type":"ContainerDied","Data":"c8d4a838048210eaf37f5766143f03c2bf793451d4aeb61c3c91e10593cf9d46"} Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.116689 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f26l" event={"ID":"e896e412-2900-44e4-908c-de1883bd9cdc","Type":"ContainerStarted","Data":"6763a29e5889773af681a475bf692b525661b28dbc0ab49dcc87e155776ecaae"} Mar 20 17:36:01 crc kubenswrapper[4690]: W0320 17:36:01.123736 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod10c4cee5_54e0_45de_a3aa_f361cbec3b63.slice/crio-a82a685e436fd2385f8525178b616b1c3aff87dedbbf9707bc8a4e3bfa705f64 WatchSource:0}: Error finding container a82a685e436fd2385f8525178b616b1c3aff87dedbbf9707bc8a4e3bfa705f64: Status 404 returned error can't find the container with id a82a685e436fd2385f8525178b616b1c3aff87dedbbf9707bc8a4e3bfa705f64 Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.132891 4690 generic.go:334] "Generic (PLEG): container finished" podID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerID="f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde" exitCode=0 Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.133237 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ssfrq" event={"ID":"244c63f8-c484-4edb-9cb6-0ac6a9dac136","Type":"ContainerDied","Data":"f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde"} Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.133290 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssfrq" event={"ID":"244c63f8-c484-4edb-9cb6-0ac6a9dac136","Type":"ContainerStarted","Data":"ea5166e7133ca5ed7a8463d26ce78afda8c19a95a9619bfeb6567454c9547370"} Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.135454 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-lsh75" event={"ID":"d1c872c1-ae2b-4fd2-bb6f-e387fab73a06","Type":"ContainerStarted","Data":"3110d2a2be95095dc771e550d297bddaff17fbe9f3688949aa02221bbdc167fd"} Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.138602 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.138776 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" event={"ID":"03b2dced-b34b-414b-aed5-0b94c0eba98c","Type":"ContainerStarted","Data":"cfd57053637ec7b1871be1627e4db5a5d9082c0c82b51ee19cf4d55e82747ec7"} Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.138795 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.63876752 +0000 UTC m=+236.504593198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.138808 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" event={"ID":"03b2dced-b34b-414b-aed5-0b94c0eba98c","Type":"ContainerStarted","Data":"c535fb0ee0f54a54782738972b01aad425b624746b55c8bc46f4b48136ec3141"} Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.138955 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.140841 4690 patch_prober.go:28] interesting pod/controller-manager-d7677567-tbqjb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.140868 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" podUID="03b2dced-b34b-414b-aed5-0b94c0eba98c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.140323 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.142331 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.642320413 +0000 UTC m=+236.508146091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.185468 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" podStartSLOduration=5.185448189 podStartE2EDuration="5.185448189s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:01.181115632 +0000 UTC m=+236.046941330" watchObservedRunningTime="2026-03-20 17:36:01.185448189 +0000 UTC m=+236.051273867" Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.243052 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.243222 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.74319654 +0000 UTC m=+236.609022208 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.243893 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.246398 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.746389483 +0000 UTC m=+236.612215161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.350274 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.350590 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.850576786 +0000 UTC m=+236.716402464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.451476 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.451902 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:01.951883355 +0000 UTC m=+236.817709093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.552588 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.553428 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:02.053381829 +0000 UTC m=+236.919207507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.564657 4690 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.625890 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:36:01 crc kubenswrapper[4690]: [-]has-synced failed: reason withheld Mar 20 17:36:01 crc kubenswrapper[4690]: [+]process-running ok Mar 20 17:36:01 crc kubenswrapper[4690]: healthz check failed Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.625967 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.655459 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.656088 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:36:02.156075949 +0000 UTC m=+237.021901617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.756886 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.757297 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:02.257281745 +0000 UTC m=+237.123107423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.858373 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.858641 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:02.358630045 +0000 UTC m=+237.224455723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.921832 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace7a9fa-7eac-449c-8b61-6018d592fc4f" path="/var/lib/kubelet/pods/ace7a9fa-7eac-449c-8b61-6018d592fc4f/volumes" Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.922611 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf13749-fd7c-4f01-9598-7f041910cd74" path="/var/lib/kubelet/pods/fcf13749-fd7c-4f01-9598-7f041910cd74/volumes" Mar 20 17:36:01 crc kubenswrapper[4690]: I0320 17:36:01.964323 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:01 crc kubenswrapper[4690]: E0320 17:36:01.965404 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:36:02.465389643 +0000 UTC m=+237.331215321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.070131 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:02 crc kubenswrapper[4690]: E0320 17:36:02.070420 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:36:02.570408911 +0000 UTC m=+237.436234589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6fhf7" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.137905 4690 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T17:36:01.564698289Z","Handler":null,"Name":""} Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.141609 4690 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.141643 4690 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.144447 4690 ???:1] "http: TLS handshake error from 192.168.126.11:46908: no serving certificate available for the kubelet" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.171546 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.175993 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.176680 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10c4cee5-54e0-45de-a3aa-f361cbec3b63","Type":"ContainerStarted","Data":"a7dd4b19c3f5352acd3d68a5e1e699064696dccb5ef54b0b440667061f63dee9"} Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.176717 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10c4cee5-54e0-45de-a3aa-f361cbec3b63","Type":"ContainerStarted","Data":"a82a685e436fd2385f8525178b616b1c3aff87dedbbf9707bc8a4e3bfa705f64"} Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.211689 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.211669033 podStartE2EDuration="2.211669033s" podCreationTimestamp="2026-03-20 17:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:02.207774759 +0000 UTC m=+237.073600437" watchObservedRunningTime="2026-03-20 17:36:02.211669033 +0000 UTC m=+237.077494711" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.233792 4690 generic.go:334] "Generic (PLEG): container finished" podID="068537fa-5883-4e11-a933-87706891d0ae" containerID="dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca" exitCode=0 Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.233963 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj5xl" event={"ID":"068537fa-5883-4e11-a933-87706891d0ae","Type":"ContainerDied","Data":"dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca"} Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.234056 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj5xl" event={"ID":"068537fa-5883-4e11-a933-87706891d0ae","Type":"ContainerStarted","Data":"c575e70c118f3401d4f7337a6269f9bf041442260c12872a11b892ef3e082faa"} Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.250176 4690 generic.go:334] "Generic (PLEG): container finished" podID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerID="1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503" exitCode=0 Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.250303 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zltxc" event={"ID":"edacb8ae-57ae-41f3-b13b-a423afa0e2dd","Type":"ContainerDied","Data":"1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503"} Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.273953 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.296175 4690 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.296210 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.357136 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-88brt" event={"ID":"43391457-a499-43df-82a4-15be4ce2a0ac","Type":"ContainerStarted","Data":"af4a96470117efc8f9388340cb3620ebc6171589de73833fc0f965696b865c46"} Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.357168 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-88brt" event={"ID":"43391457-a499-43df-82a4-15be4ce2a0ac","Type":"ContainerStarted","Data":"8efb4983ad97c141711af984878a29a93256718dc3495a5e82b5b24a33e71184"} Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.367493 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.421298 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6fhf7\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.496817 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.511105 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89"] Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.512129 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.520898 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89"] Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.524487 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.524797 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.525042 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.525194 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.527166 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.527446 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.626908 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:36:02 crc kubenswrapper[4690]: [-]has-synced failed: reason withheld Mar 20 17:36:02 crc kubenswrapper[4690]: [+]process-running ok Mar 20 17:36:02 crc kubenswrapper[4690]: healthz check failed Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.626970 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.680491 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-client-ca\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.680767 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdzn5\" (UniqueName: \"kubernetes.io/projected/bcc0f390-eb81-4393-843f-731dffa103c1-kube-api-access-zdzn5\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.680803 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc0f390-eb81-4393-843f-731dffa103c1-serving-cert\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: 
\"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.680824 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-config\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.781480 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc0f390-eb81-4393-843f-731dffa103c1-serving-cert\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.781511 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-config\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.781609 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-client-ca\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.781630 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdzn5\" (UniqueName: \"kubernetes.io/projected/bcc0f390-eb81-4393-843f-731dffa103c1-kube-api-access-zdzn5\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.783225 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-config\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.786087 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-client-ca\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.795572 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc0f390-eb81-4393-843f-731dffa103c1-serving-cert\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " 
pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.797646 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdzn5\" (UniqueName: \"kubernetes.io/projected/bcc0f390-eb81-4393-843f-731dffa103c1-kube-api-access-zdzn5\") pod \"route-controller-manager-c6846f548-qgm89\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.801490 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.882938 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-secret-volume\") pod \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.883049 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-config-volume\") pod \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.883068 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpkrr\" (UniqueName: \"kubernetes.io/projected/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-kube-api-access-rpkrr\") pod \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\" (UID: \"03f86e30-e6e2-473e-8a52-c1e86d28c2e2\") " Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.884195 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-config-volume" (OuterVolumeSpecName: "config-volume") pod "03f86e30-e6e2-473e-8a52-c1e86d28c2e2" (UID: "03f86e30-e6e2-473e-8a52-c1e86d28c2e2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.887063 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.896941 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-kube-api-access-rpkrr" (OuterVolumeSpecName: "kube-api-access-rpkrr") pod "03f86e30-e6e2-473e-8a52-c1e86d28c2e2" (UID: "03f86e30-e6e2-473e-8a52-c1e86d28c2e2"). InnerVolumeSpecName "kube-api-access-rpkrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.896946 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "03f86e30-e6e2-473e-8a52-c1e86d28c2e2" (UID: "03f86e30-e6e2-473e-8a52-c1e86d28c2e2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.972593 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:36:02 crc kubenswrapper[4690]: E0320 17:36:02.973001 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f86e30-e6e2-473e-8a52-c1e86d28c2e2" containerName="collect-profiles" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.973012 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f86e30-e6e2-473e-8a52-c1e86d28c2e2" containerName="collect-profiles" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.973103 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f86e30-e6e2-473e-8a52-c1e86d28c2e2" containerName="collect-profiles" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.982950 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.984609 4690 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.984622 4690 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.984632 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpkrr\" (UniqueName: \"kubernetes.io/projected/03f86e30-e6e2-473e-8a52-c1e86d28c2e2-kube-api-access-rpkrr\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.986531 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.993923 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:36:02 crc kubenswrapper[4690]: I0320 17:36:02.995696 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.087804 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.087890 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.127749 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.132367 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.139959 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.145133 4690 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9wf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.145175 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9wf6" podUID="789eef8f-04a8-44cf-9e16-878de3a035bb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.145406 4690 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9wf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.145448 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-v9wf6" podUID="789eef8f-04a8-44cf-9e16-878de3a035bb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.178863 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.178896 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.192862 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.192962 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.193319 4690 patch_prober.go:28] interesting pod/console-f9d7485db-ppgjz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.193345 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ppgjz" podUID="c4eaf3f2-8536-46bf-8c5f-82606abec128" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.193399 4690 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.219930 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.251247 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.251307 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.258132 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.259432 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.322710 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.355926 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89"] Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.383945 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fhf7"] Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.399494 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-88brt" event={"ID":"43391457-a499-43df-82a4-15be4ce2a0ac","Type":"ContainerStarted","Data":"80e46eba0da5479797a92c7fc762387a273418c9368056ebcf3ec2a98e5efb62"} Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.412121 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" event={"ID":"03f86e30-e6e2-473e-8a52-c1e86d28c2e2","Type":"ContainerDied","Data":"91f674c647b1cbb8c68f356504fcd0b8547628a097fac43eb8b4a938b1131e9a"} Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.412148 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91f674c647b1cbb8c68f356504fcd0b8547628a097fac43eb8b4a938b1131e9a" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.412206 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.417647 4690 generic.go:334] "Generic (PLEG): container finished" podID="10c4cee5-54e0-45de-a3aa-f361cbec3b63" containerID="a7dd4b19c3f5352acd3d68a5e1e699064696dccb5ef54b0b440667061f63dee9" exitCode=0 Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.417703 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10c4cee5-54e0-45de-a3aa-f361cbec3b63","Type":"ContainerDied","Data":"a7dd4b19c3f5352acd3d68a5e1e699064696dccb5ef54b0b440667061f63dee9"} Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.425292 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-8l2n9" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.426817 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-52php" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.427166 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-88brt" podStartSLOduration=13.427143146 podStartE2EDuration="13.427143146s" podCreationTimestamp="2026-03-20 17:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:03.424486269 +0000 UTC m=+238.290311947" watchObservedRunningTime="2026-03-20 17:36:03.427143146 +0000 UTC m=+238.292968824" Mar 20 17:36:03 crc kubenswrapper[4690]: W0320 17:36:03.458303 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc0f390_eb81_4393_843f_731dffa103c1.slice/crio-5c3e60f811d82e9a4d6e540671faee3bf3e9c7a57ce0def48be23289c683085e WatchSource:0}: Error finding container 5c3e60f811d82e9a4d6e540671faee3bf3e9c7a57ce0def48be23289c683085e: Status 404 returned error can't find the container with id 5c3e60f811d82e9a4d6e540671faee3bf3e9c7a57ce0def48be23289c683085e Mar 20 17:36:03 crc kubenswrapper[4690]: W0320 17:36:03.554423 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11dbcb55_a8fa_4a9b_a8a3_bf6e79fa18c8.slice/crio-656e2bbbac3fdf4d70614ec5676403b5a6fbb7ca5c8bba31f472a2bfcf23e8f4 WatchSource:0}: Error finding container 656e2bbbac3fdf4d70614ec5676403b5a6fbb7ca5c8bba31f472a2bfcf23e8f4: Status 404 returned error can't find the container with id 656e2bbbac3fdf4d70614ec5676403b5a6fbb7ca5c8bba31f472a2bfcf23e8f4 Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.625361 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.632357 4690 patch_prober.go:28] interesting pod/router-default-5444994796-sv7wd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:36:03 crc kubenswrapper[4690]: [-]has-synced failed: reason withheld Mar 20 17:36:03 crc kubenswrapper[4690]: [+]process-running ok Mar 20 17:36:03 crc kubenswrapper[4690]: healthz check failed Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.632404 4690 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-sv7wd" podUID="906d9a20-0731-435a-80af-0dab64476e32" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.906346 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 17:36:03 crc kubenswrapper[4690]: I0320 17:36:03.974910 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.437098 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" event={"ID":"bcc0f390-eb81-4393-843f-731dffa103c1","Type":"ContainerStarted","Data":"5e54004d98bd59c3be6c28a26d9fdde3c0146ea1cf6be9292e219b8ec325da60"} Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.437142 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" event={"ID":"bcc0f390-eb81-4393-843f-731dffa103c1","Type":"ContainerStarted","Data":"5c3e60f811d82e9a4d6e540671faee3bf3e9c7a57ce0def48be23289c683085e"} Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.437508 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.443637 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" event={"ID":"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8","Type":"ContainerStarted","Data":"4adc951754cfda921010f0fa0d9abfc0c746e7568c061110a54ad12757acf5eb"} Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.443701 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.443716 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" event={"ID":"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8","Type":"ContainerStarted","Data":"656e2bbbac3fdf4d70614ec5676403b5a6fbb7ca5c8bba31f472a2bfcf23e8f4"} Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.454447 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4","Type":"ContainerStarted","Data":"4e2a8e8d3704ce6b0c84233b27d9694de5f342dc8a5fd089027a83ccbde6510d"} Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.480221 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" podStartSLOduration=8.4801993 podStartE2EDuration="8.4801993s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:04.463937037 +0000 UTC m=+239.329762715" watchObservedRunningTime="2026-03-20 17:36:04.4801993 +0000 UTC m=+239.346024978" Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.585718 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:04 
crc kubenswrapper[4690]: I0320 17:36:04.627223 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" podStartSLOduration=196.62720797 podStartE2EDuration="3m16.62720797s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:04.512733418 +0000 UTC m=+239.378559116" watchObservedRunningTime="2026-03-20 17:36:04.62720797 +0000 UTC m=+239.493033648" Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.627361 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.643062 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-sv7wd" Mar 20 17:36:04 crc kubenswrapper[4690]: I0320 17:36:04.958138 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.039790 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kube-api-access\") pod \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\" (UID: \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\") " Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.039949 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kubelet-dir\") pod \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\" (UID: \"10c4cee5-54e0-45de-a3aa-f361cbec3b63\") " Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.040083 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "10c4cee5-54e0-45de-a3aa-f361cbec3b63" (UID: "10c4cee5-54e0-45de-a3aa-f361cbec3b63"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.040517 4690 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.046602 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "10c4cee5-54e0-45de-a3aa-f361cbec3b63" (UID: "10c4cee5-54e0-45de-a3aa-f361cbec3b63"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.153968 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10c4cee5-54e0-45de-a3aa-f361cbec3b63-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.481102 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10c4cee5-54e0-45de-a3aa-f361cbec3b63","Type":"ContainerDied","Data":"a82a685e436fd2385f8525178b616b1c3aff87dedbbf9707bc8a4e3bfa705f64"} Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.481155 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a82a685e436fd2385f8525178b616b1c3aff87dedbbf9707bc8a4e3bfa705f64" Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.481217 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.484285 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4","Type":"ContainerStarted","Data":"50d4e4866b224c6736f808c5ea51184b12f7af80dc614df999282226969a689a"} Mar 20 17:36:05 crc kubenswrapper[4690]: I0320 17:36:05.506827 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.506810166 podStartE2EDuration="3.506810166s" podCreationTimestamp="2026-03-20 17:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:05.506013833 +0000 UTC m=+240.371839521" watchObservedRunningTime="2026-03-20 17:36:05.506810166 +0000 UTC m=+240.372635834" Mar 20 17:36:06 crc kubenswrapper[4690]: I0320 17:36:06.490915 4690 generic.go:334] "Generic (PLEG): container finished" podID="0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4" containerID="50d4e4866b224c6736f808c5ea51184b12f7af80dc614df999282226969a689a" exitCode=0 Mar 20 17:36:06 crc kubenswrapper[4690]: I0320 17:36:06.491119 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4","Type":"ContainerDied","Data":"50d4e4866b224c6736f808c5ea51184b12f7af80dc614df999282226969a689a"} Mar 20 17:36:07 crc kubenswrapper[4690]: I0320 17:36:07.296049 4690 ???:1] "http: TLS handshake error from 192.168.126.11:48050: no serving certificate available for the kubelet" Mar 20 17:36:07 crc kubenswrapper[4690]: I0320 17:36:07.397205 4690 ???:1] "http: TLS handshake error from 192.168.126.11:48066: no serving certificate available for the kubelet" Mar 20 17:36:07 crc kubenswrapper[4690]: I0320 17:36:07.873846 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.029691 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kube-api-access\") pod \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\" (UID: \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\") " Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.029765 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kubelet-dir\") pod \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\" (UID: \"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4\") " Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.030121 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4" (UID: "0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.037327 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4" (UID: "0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.141318 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.141391 4690 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.529188 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4","Type":"ContainerDied","Data":"4e2a8e8d3704ce6b0c84233b27d9694de5f342dc8a5fd089027a83ccbde6510d"} Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.529239 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e2a8e8d3704ce6b0c84233b27d9694de5f342dc8a5fd089027a83ccbde6510d" Mar 20 17:36:08 crc kubenswrapper[4690]: I0320 17:36:08.529317 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:36:09 crc kubenswrapper[4690]: I0320 17:36:09.330805 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6rktv" Mar 20 17:36:13 crc kubenswrapper[4690]: I0320 17:36:13.144716 4690 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9wf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 20 17:36:13 crc kubenswrapper[4690]: I0320 17:36:13.145028 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9wf6" podUID="789eef8f-04a8-44cf-9e16-878de3a035bb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 20 17:36:13 crc kubenswrapper[4690]: I0320 17:36:13.144803 4690 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9wf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 20 17:36:13 crc kubenswrapper[4690]: I0320 17:36:13.145128 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-v9wf6" podUID="789eef8f-04a8-44cf-9e16-878de3a035bb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 20 17:36:13 crc kubenswrapper[4690]: I0320 17:36:13.178343 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:36:13 crc kubenswrapper[4690]: I0320 17:36:13.183168 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:36:15 crc kubenswrapper[4690]: I0320 17:36:15.830424 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d7677567-tbqjb"] Mar 20 17:36:15 crc kubenswrapper[4690]: I0320 17:36:15.830944 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" podUID="03b2dced-b34b-414b-aed5-0b94c0eba98c" containerName="controller-manager" containerID="cri-o://cfd57053637ec7b1871be1627e4db5a5d9082c0c82b51ee19cf4d55e82747ec7" gracePeriod=30 Mar 20 17:36:15 crc kubenswrapper[4690]: I0320 17:36:15.848664 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89"] Mar 20 17:36:15 crc kubenswrapper[4690]: I0320 17:36:15.849189 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" podUID="bcc0f390-eb81-4393-843f-731dffa103c1" containerName="route-controller-manager" containerID="cri-o://5e54004d98bd59c3be6c28a26d9fdde3c0146ea1cf6be9292e219b8ec325da60" gracePeriod=30 Mar 20 17:36:17 crc kubenswrapper[4690]: I0320 17:36:17.567190 4690 ???:1] "http: TLS handshake error from 192.168.126.11:42496: no serving certificate available for the kubelet" Mar 20 17:36:17 crc kubenswrapper[4690]: I0320 17:36:17.601958 4690 generic.go:334] "Generic (PLEG): container finished" podID="03b2dced-b34b-414b-aed5-0b94c0eba98c" 
containerID="cfd57053637ec7b1871be1627e4db5a5d9082c0c82b51ee19cf4d55e82747ec7" exitCode=0 Mar 20 17:36:17 crc kubenswrapper[4690]: I0320 17:36:17.602079 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" event={"ID":"03b2dced-b34b-414b-aed5-0b94c0eba98c","Type":"ContainerDied","Data":"cfd57053637ec7b1871be1627e4db5a5d9082c0c82b51ee19cf4d55e82747ec7"} Mar 20 17:36:17 crc kubenswrapper[4690]: I0320 17:36:17.604578 4690 generic.go:334] "Generic (PLEG): container finished" podID="bcc0f390-eb81-4393-843f-731dffa103c1" containerID="5e54004d98bd59c3be6c28a26d9fdde3c0146ea1cf6be9292e219b8ec325da60" exitCode=0 Mar 20 17:36:17 crc kubenswrapper[4690]: I0320 17:36:17.604640 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" event={"ID":"bcc0f390-eb81-4393-843f-731dffa103c1","Type":"ContainerDied","Data":"5e54004d98bd59c3be6c28a26d9fdde3c0146ea1cf6be9292e219b8ec325da60"} Mar 20 17:36:19 crc kubenswrapper[4690]: I0320 17:36:19.986450 4690 patch_prober.go:28] interesting pod/controller-manager-d7677567-tbqjb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 20 17:36:19 crc kubenswrapper[4690]: I0320 17:36:19.986839 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" podUID="03b2dced-b34b-414b-aed5-0b94c0eba98c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 20 17:36:22 crc kubenswrapper[4690]: I0320 17:36:22.507407 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:36:22 crc kubenswrapper[4690]: I0320 17:36:22.888848 4690 patch_prober.go:28] interesting pod/route-controller-manager-c6846f548-qgm89 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Mar 20 17:36:22 crc kubenswrapper[4690]: I0320 17:36:22.888905 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" podUID="bcc0f390-eb81-4393-843f-731dffa103c1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Mar 20 17:36:23 crc kubenswrapper[4690]: I0320 17:36:23.152566 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-v9wf6" Mar 20 17:36:24 crc kubenswrapper[4690]: I0320 17:36:24.274612 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:36:24 crc kubenswrapper[4690]: I0320 17:36:24.274691 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" 
podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:36:29 crc kubenswrapper[4690]: I0320 17:36:29.985899 4690 patch_prober.go:28] interesting pod/controller-manager-d7677567-tbqjb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 20 17:36:29 crc kubenswrapper[4690]: I0320 17:36:29.987165 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" podUID="03b2dced-b34b-414b-aed5-0b94c0eba98c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 20 17:36:32 crc kubenswrapper[4690]: I0320 17:36:32.888157 4690 patch_prober.go:28] interesting pod/route-controller-manager-c6846f548-qgm89 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Mar 20 17:36:32 crc kubenswrapper[4690]: I0320 17:36:32.888223 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" podUID="bcc0f390-eb81-4393-843f-731dffa103c1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Mar 20 17:36:33 crc kubenswrapper[4690]: I0320 17:36:33.545302 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nfmkn" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.330815 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:36:36 crc kubenswrapper[4690]: E0320 17:36:36.331555 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4" containerName="pruner" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.331570 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4" containerName="pruner" Mar 20 17:36:36 crc kubenswrapper[4690]: E0320 17:36:36.331587 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c4cee5-54e0-45de-a3aa-f361cbec3b63" containerName="pruner" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.331593 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c4cee5-54e0-45de-a3aa-f361cbec3b63" containerName="pruner" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.331697 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c4cee5-54e0-45de-a3aa-f361cbec3b63" containerName="pruner" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.331705 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebd561b-3fb9-4bd9-a028-8ec4af98a8b4" containerName="pruner" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.332079 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.334381 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.334381 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.344173 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.468918 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.469103 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.570677 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.570758 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.570822 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.587979 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:36 crc kubenswrapper[4690]: I0320 17:36:36.659241 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.338664 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.344915 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.378366 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj"] Mar 20 17:36:38 crc kubenswrapper[4690]: E0320 17:36:38.378590 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b2dced-b34b-414b-aed5-0b94c0eba98c" containerName="controller-manager" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.378603 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b2dced-b34b-414b-aed5-0b94c0eba98c" containerName="controller-manager" Mar 20 17:36:38 crc kubenswrapper[4690]: E0320 17:36:38.378622 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc0f390-eb81-4393-843f-731dffa103c1" containerName="route-controller-manager" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.378628 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc0f390-eb81-4393-843f-731dffa103c1" containerName="route-controller-manager" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.378724 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc0f390-eb81-4393-843f-731dffa103c1" containerName="route-controller-manager" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.378736 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b2dced-b34b-414b-aed5-0b94c0eba98c" containerName="controller-manager" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.379091 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.384943 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj"] Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421351 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7x7t\" (UniqueName: \"kubernetes.io/projected/03b2dced-b34b-414b-aed5-0b94c0eba98c-kube-api-access-g7x7t\") pod \"03b2dced-b34b-414b-aed5-0b94c0eba98c\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421423 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b2dced-b34b-414b-aed5-0b94c0eba98c-serving-cert\") pod \"03b2dced-b34b-414b-aed5-0b94c0eba98c\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421489 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-proxy-ca-bundles\") pod \"03b2dced-b34b-414b-aed5-0b94c0eba98c\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421530 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-config\") pod \"bcc0f390-eb81-4393-843f-731dffa103c1\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421584 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-config\") pod \"03b2dced-b34b-414b-aed5-0b94c0eba98c\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421625 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc0f390-eb81-4393-843f-731dffa103c1-serving-cert\") pod \"bcc0f390-eb81-4393-843f-731dffa103c1\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421658 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-client-ca\") pod \"03b2dced-b34b-414b-aed5-0b94c0eba98c\" (UID: \"03b2dced-b34b-414b-aed5-0b94c0eba98c\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421700 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-client-ca\") pod \"bcc0f390-eb81-4393-843f-731dffa103c1\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421742 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdzn5\" (UniqueName: \"kubernetes.io/projected/bcc0f390-eb81-4393-843f-731dffa103c1-kube-api-access-zdzn5\") pod \"bcc0f390-eb81-4393-843f-731dffa103c1\" (UID: \"bcc0f390-eb81-4393-843f-731dffa103c1\") " Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421889 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-config\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.421984 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-client-ca\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.422875 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "03b2dced-b34b-414b-aed5-0b94c0eba98c" (UID: "03b2dced-b34b-414b-aed5-0b94c0eba98c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.422901 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-client-ca" (OuterVolumeSpecName: "client-ca") pod "03b2dced-b34b-414b-aed5-0b94c0eba98c" (UID: "03b2dced-b34b-414b-aed5-0b94c0eba98c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.423377 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-config" (OuterVolumeSpecName: "config") pod "bcc0f390-eb81-4393-843f-731dffa103c1" (UID: "bcc0f390-eb81-4393-843f-731dffa103c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.423571 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-config" (OuterVolumeSpecName: "config") pod "03b2dced-b34b-414b-aed5-0b94c0eba98c" (UID: "03b2dced-b34b-414b-aed5-0b94c0eba98c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.422027 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-proxy-ca-bundles\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.424415 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5d5\" (UniqueName: \"kubernetes.io/projected/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-kube-api-access-7z5d5\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.424482 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-serving-cert\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.424293 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-client-ca" (OuterVolumeSpecName: "client-ca") pod "bcc0f390-eb81-4393-843f-731dffa103c1" (UID: "bcc0f390-eb81-4393-843f-731dffa103c1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.425124 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.425241 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.425379 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.425488 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03b2dced-b34b-414b-aed5-0b94c0eba98c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.425593 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc0f390-eb81-4393-843f-731dffa103c1-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.426732 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc0f390-eb81-4393-843f-731dffa103c1-kube-api-access-zdzn5" (OuterVolumeSpecName: "kube-api-access-zdzn5") pod "bcc0f390-eb81-4393-843f-731dffa103c1" (UID: "bcc0f390-eb81-4393-843f-731dffa103c1"). InnerVolumeSpecName "kube-api-access-zdzn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.427176 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b2dced-b34b-414b-aed5-0b94c0eba98c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "03b2dced-b34b-414b-aed5-0b94c0eba98c" (UID: "03b2dced-b34b-414b-aed5-0b94c0eba98c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.428365 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc0f390-eb81-4393-843f-731dffa103c1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bcc0f390-eb81-4393-843f-731dffa103c1" (UID: "bcc0f390-eb81-4393-843f-731dffa103c1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.434502 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b2dced-b34b-414b-aed5-0b94c0eba98c-kube-api-access-g7x7t" (OuterVolumeSpecName: "kube-api-access-g7x7t") pod "03b2dced-b34b-414b-aed5-0b94c0eba98c" (UID: "03b2dced-b34b-414b-aed5-0b94c0eba98c"). InnerVolumeSpecName "kube-api-access-g7x7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.526807 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5d5\" (UniqueName: \"kubernetes.io/projected/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-kube-api-access-7z5d5\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.526900 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-serving-cert\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.526963 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-config\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.527032 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-client-ca\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.527052 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-proxy-ca-bundles\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.527127 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcc0f390-eb81-4393-843f-731dffa103c1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.527140 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdzn5\" (UniqueName: \"kubernetes.io/projected/bcc0f390-eb81-4393-843f-731dffa103c1-kube-api-access-zdzn5\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.527151 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7x7t\" (UniqueName: \"kubernetes.io/projected/03b2dced-b34b-414b-aed5-0b94c0eba98c-kube-api-access-g7x7t\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.527161 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b2dced-b34b-414b-aed5-0b94c0eba98c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.528957 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-proxy-ca-bundles\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: 
\"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.529319 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-config\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.529866 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-client-ca\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.533709 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-serving-cert\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.549541 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5d5\" (UniqueName: \"kubernetes.io/projected/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-kube-api-access-7z5d5\") pod \"controller-manager-7ff4bd94b8-d4tbj\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.706762 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.717922 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" event={"ID":"bcc0f390-eb81-4393-843f-731dffa103c1","Type":"ContainerDied","Data":"5c3e60f811d82e9a4d6e540671faee3bf3e9c7a57ce0def48be23289c683085e"} Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.717954 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.717988 4690 scope.go:117] "RemoveContainer" containerID="5e54004d98bd59c3be6c28a26d9fdde3c0146ea1cf6be9292e219b8ec325da60" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.719906 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" event={"ID":"03b2dced-b34b-414b-aed5-0b94c0eba98c","Type":"ContainerDied","Data":"c535fb0ee0f54a54782738972b01aad425b624746b55c8bc46f4b48136ec3141"} Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.719974 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d7677567-tbqjb" Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.748371 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89"] Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.755323 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6846f548-qgm89"] Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.758683 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d7677567-tbqjb"] Mar 20 17:36:38 crc kubenswrapper[4690]: I0320 17:36:38.761499 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d7677567-tbqjb"] Mar 20 17:36:39 crc kubenswrapper[4690]: E0320 17:36:39.276920 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 17:36:39 crc kubenswrapper[4690]: E0320 17:36:39.277077 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:36:39 crc kubenswrapper[4690]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 17:36:39 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nw2qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567136-lsh75_openshift-infra(d1c872c1-ae2b-4fd2-bb6f-e387fab73a06): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 17:36:39 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:36:39 crc kubenswrapper[4690]: E0320 17:36:39.278290 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567136-lsh75" podUID="d1c872c1-ae2b-4fd2-bb6f-e387fab73a06" Mar 20 17:36:39 crc kubenswrapper[4690]: E0320 17:36:39.725123 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567136-lsh75" podUID="d1c872c1-ae2b-4fd2-bb6f-e387fab73a06" Mar 20 17:36:39 crc kubenswrapper[4690]: I0320 17:36:39.889742 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="03b2dced-b34b-414b-aed5-0b94c0eba98c" path="/var/lib/kubelet/pods/03b2dced-b34b-414b-aed5-0b94c0eba98c/volumes" Mar 20 17:36:39 crc kubenswrapper[4690]: I0320 17:36:39.890818 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc0f390-eb81-4393-843f-731dffa103c1" path="/var/lib/kubelet/pods/bcc0f390-eb81-4393-843f-731dffa103c1/volumes" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.543231 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl"] Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.544271 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.546675 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.546889 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.547402 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.549734 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.549770 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.550397 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.561507 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl"] Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.655104 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-serving-cert\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.655162 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8knx\" (UniqueName: \"kubernetes.io/projected/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-kube-api-access-k8knx\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.655417 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-client-ca\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: 
I0320 17:36:40.655475 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-config\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.757331 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-serving-cert\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.757398 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8knx\" (UniqueName: \"kubernetes.io/projected/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-kube-api-access-k8knx\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.757464 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-client-ca\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.757485 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-config\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.758520 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-client-ca\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.758841 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-config\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.761467 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-serving-cert\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.773298 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k8knx\" (UniqueName: \"kubernetes.io/projected/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-kube-api-access-k8knx\") pod \"route-controller-manager-5d699b74bd-pctjl\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:40 crc kubenswrapper[4690]: E0320 17:36:40.834878 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 17:36:40 crc kubenswrapper[4690]: E0320 17:36:40.835066 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jtsrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-vwgk4_openshift-marketplace(416d626a-ef44-4b4e-91ce-51042b01a45a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:36:40 crc kubenswrapper[4690]: E0320 17:36:40.836331 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vwgk4" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" Mar 20 17:36:40 crc kubenswrapper[4690]: I0320 17:36:40.901799 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.520017 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.521059 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.535090 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.566503 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de98586b-5aaf-464b-aceb-0493a4c4a84b-kube-api-access\") pod \"installer-9-crc\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.566706 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-var-lock\") pod \"installer-9-crc\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.566747 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.668061 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de98586b-5aaf-464b-aceb-0493a4c4a84b-kube-api-access\") pod \"installer-9-crc\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.668199 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-var-lock\") pod \"installer-9-crc\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.668229 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.668333 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.668705 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-var-lock\") pod \"installer-9-crc\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.694728 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de98586b-5aaf-464b-aceb-0493a4c4a84b-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:41 crc kubenswrapper[4690]: I0320 17:36:41.864712 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.079080 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vwgk4" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.135287 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.135450 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:36:42 crc kubenswrapper[4690]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 17:36:42 crc kubenswrapper[4690]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9flwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567134-66l98_openshift-infra(34d2f5b9-1f8e-4413-b178-58cd10fa7548): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 17:36:42 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.136627 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567134-66l98" podUID="34d2f5b9-1f8e-4413-b178-58cd10fa7548" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.140326 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.140444 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-phklv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ssfrq_openshift-marketplace(244c63f8-c484-4edb-9cb6-0ac6a9dac136): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.141638 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ssfrq" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.186548 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.186769 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8gtg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6f26l_openshift-marketplace(e896e412-2900-44e4-908c-de1883bd9cdc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.187993 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6f26l" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.197181 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.197437 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nd42m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bnxz2_openshift-marketplace(7552fec8-7b03-4ad9-8410-1705f639433e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.198767 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bnxz2" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" Mar 20 17:36:42 crc kubenswrapper[4690]: E0320 17:36:42.743965 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567134-66l98" podUID="34d2f5b9-1f8e-4413-b178-58cd10fa7548" Mar 20 17:36:43 crc kubenswrapper[4690]: E0320 17:36:43.692372 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ssfrq" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" Mar 20 17:36:43 crc kubenswrapper[4690]: E0320 17:36:43.692404 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6f26l" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" Mar 20 17:36:43 crc kubenswrapper[4690]: E0320 17:36:43.692404 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bnxz2" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" Mar 20 17:36:43 crc 
kubenswrapper[4690]: E0320 17:36:43.767653 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 17:36:43 crc kubenswrapper[4690]: E0320 17:36:43.767796 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26622,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-sdqjm_openshift-marketplace(29dcb3ba-2c4c-41f1-a655-02ce44ab280f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:36:43 crc kubenswrapper[4690]: E0320 17:36:43.768969 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-sdqjm" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" Mar 20 17:36:43 crc kubenswrapper[4690]: E0320 17:36:43.796840 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 17:36:43 crc kubenswrapper[4690]: E0320 17:36:43.796995 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jpch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4m7xw_openshift-marketplace(30d0d78a-3ea1-4ce6-b8fb-13645cfedf18): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:36:43 crc kubenswrapper[4690]: E0320 17:36:43.798224 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4m7xw" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.162617 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-sdqjm" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.162904 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4m7xw" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.191580 4690 scope.go:117] "RemoveContainer" containerID="cfd57053637ec7b1871be1627e4db5a5d9082c0c82b51ee19cf4d55e82747ec7" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.236112 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.236574 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m4tmr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gj5xl_openshift-marketplace(068537fa-5883-4e11-a933-87706891d0ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.237937 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gj5xl" podUID="068537fa-5883-4e11-a933-87706891d0ae" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.304813 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.304960 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-87j8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zltxc_openshift-marketplace(edacb8ae-57ae-41f3-b13b-a423afa0e2dd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.306121 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zltxc" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.392609 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl"] Mar 20 17:36:47 crc kubenswrapper[4690]: W0320 17:36:47.400223 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a2404df_2583_48d9_a6e5_59daa7a3a8a8.slice/crio-9c4fffbb7f288a3a8e7d86208e3433e98c4011ba2447b61a54ec96bf6f8d6440 WatchSource:0}: Error finding container 9c4fffbb7f288a3a8e7d86208e3433e98c4011ba2447b61a54ec96bf6f8d6440: Status 404 returned error can't find the container with id 9c4fffbb7f288a3a8e7d86208e3433e98c4011ba2447b61a54ec96bf6f8d6440 Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.478735 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.642917 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.648989 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj"] Mar 20 17:36:47 crc kubenswrapper[4690]: W0320 17:36:47.652552 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podde98586b_5aaf_464b_aceb_0493a4c4a84b.slice/crio-8ddfae8f7f946034d5a5ba03c4905671b9c686703d33af849f26c8551d5d4df5 WatchSource:0}: Error finding container 
8ddfae8f7f946034d5a5ba03c4905671b9c686703d33af849f26c8551d5d4df5: Status 404 returned error can't find the container with id 8ddfae8f7f946034d5a5ba03c4905671b9c686703d33af849f26c8551d5d4df5 Mar 20 17:36:47 crc kubenswrapper[4690]: W0320 17:36:47.657705 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd7aad87_3b32_4095_b7f5_efe8d25a53a9.slice/crio-d4344f54bfe290add9a4b0517651a23e843700a91d6994d394d65f155dc3897a WatchSource:0}: Error finding container d4344f54bfe290add9a4b0517651a23e843700a91d6994d394d65f155dc3897a: Status 404 returned error can't find the container with id d4344f54bfe290add9a4b0517651a23e843700a91d6994d394d65f155dc3897a Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.772109 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de98586b-5aaf-464b-aceb-0493a4c4a84b","Type":"ContainerStarted","Data":"8ddfae8f7f946034d5a5ba03c4905671b9c686703d33af849f26c8551d5d4df5"} Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.774622 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" event={"ID":"fd7aad87-3b32-4095-b7f5-efe8d25a53a9","Type":"ContainerStarted","Data":"d4344f54bfe290add9a4b0517651a23e843700a91d6994d394d65f155dc3897a"} Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.781423 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22016c8f-24ff-47ef-ad5f-1e22ef59ae23","Type":"ContainerStarted","Data":"f6b9b583c5acdc1c0fcfb8875c73d7d616372f1fdd3e788e39589b24ad3bb695"} Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.783036 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" event={"ID":"8a2404df-2583-48d9-a6e5-59daa7a3a8a8","Type":"ContainerStarted","Data":"b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae"} Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.783090 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" event={"ID":"8a2404df-2583-48d9-a6e5-59daa7a3a8a8","Type":"ContainerStarted","Data":"9c4fffbb7f288a3a8e7d86208e3433e98c4011ba2447b61a54ec96bf6f8d6440"} Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.784481 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.786105 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zltxc" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" Mar 20 17:36:47 crc kubenswrapper[4690]: E0320 17:36:47.786665 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gj5xl" podUID="068537fa-5883-4e11-a933-87706891d0ae" Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.817494 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" podStartSLOduration=12.81747711 podStartE2EDuration="12.81747711s" podCreationTimestamp="2026-03-20 17:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:47.816953645 +0000 UTC m=+282.682779343" watchObservedRunningTime="2026-03-20 17:36:47.81747711 +0000 UTC m=+282.683302788" Mar 20 17:36:47 crc kubenswrapper[4690]: I0320 17:36:47.983664 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:36:48 crc kubenswrapper[4690]: I0320 17:36:48.802968 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de98586b-5aaf-464b-aceb-0493a4c4a84b","Type":"ContainerStarted","Data":"8ecf0177b9fbb4044bcd83d45908355a3bf5d82ac73fe3f3e442ed72a7a4418e"} Mar 20 17:36:48 crc kubenswrapper[4690]: I0320 17:36:48.804083 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" event={"ID":"fd7aad87-3b32-4095-b7f5-efe8d25a53a9","Type":"ContainerStarted","Data":"6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620"} Mar 20 17:36:48 crc kubenswrapper[4690]: I0320 17:36:48.804264 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:48 crc kubenswrapper[4690]: I0320 17:36:48.807308 4690 generic.go:334] "Generic (PLEG): container finished" podID="22016c8f-24ff-47ef-ad5f-1e22ef59ae23" containerID="d4fa4a0e739b3bcde2386f2262fa9423f372e0f74a9b7b5568ea86627c437198" exitCode=0 Mar 20 17:36:48 crc kubenswrapper[4690]: I0320 17:36:48.807474 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22016c8f-24ff-47ef-ad5f-1e22ef59ae23","Type":"ContainerDied","Data":"d4fa4a0e739b3bcde2386f2262fa9423f372e0f74a9b7b5568ea86627c437198"} Mar 20 17:36:48 crc kubenswrapper[4690]: I0320 17:36:48.809220 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:36:48 crc kubenswrapper[4690]: I0320 17:36:48.821157 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.8211391070000005 podStartE2EDuration="7.821139107s" podCreationTimestamp="2026-03-20 17:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:48.817933465 +0000 UTC m=+283.683759133" watchObservedRunningTime="2026-03-20 17:36:48.821139107 +0000 UTC m=+283.686964785" Mar 20 17:36:48 crc kubenswrapper[4690]: I0320 17:36:48.844134 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" podStartSLOduration=13.844107368 podStartE2EDuration="13.844107368s" podCreationTimestamp="2026-03-20 17:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:48.838664572 +0000 UTC m=+283.704490250" watchObservedRunningTime="2026-03-20 17:36:48.844107368 +0000 UTC m=+283.709933086" Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 
17:36:50.058553 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.234166 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kube-api-access\") pod \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\" (UID: \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\") " Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.235215 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kubelet-dir\") pod \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\" (UID: \"22016c8f-24ff-47ef-ad5f-1e22ef59ae23\") " Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.235374 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "22016c8f-24ff-47ef-ad5f-1e22ef59ae23" (UID: "22016c8f-24ff-47ef-ad5f-1e22ef59ae23"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.235960 4690 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.241621 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "22016c8f-24ff-47ef-ad5f-1e22ef59ae23" (UID: "22016c8f-24ff-47ef-ad5f-1e22ef59ae23"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.338232 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22016c8f-24ff-47ef-ad5f-1e22ef59ae23-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.821409 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22016c8f-24ff-47ef-ad5f-1e22ef59ae23","Type":"ContainerDied","Data":"f6b9b583c5acdc1c0fcfb8875c73d7d616372f1fdd3e788e39589b24ad3bb695"} Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.821483 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6b9b583c5acdc1c0fcfb8875c73d7d616372f1fdd3e788e39589b24ad3bb695" Mar 20 17:36:50 crc kubenswrapper[4690]: I0320 17:36:50.821487 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:36:54 crc kubenswrapper[4690]: I0320 17:36:54.274345 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:36:54 crc kubenswrapper[4690]: I0320 17:36:54.274713 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:36:54 crc kubenswrapper[4690]: I0320 17:36:54.274795 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:36:54 crc kubenswrapper[4690]: I0320 17:36:54.275749 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:36:54 crc kubenswrapper[4690]: I0320 17:36:54.275861 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e" gracePeriod=600 Mar 20 17:36:54 crc kubenswrapper[4690]: I0320 17:36:54.846890 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4690]: I0320 17:36:54.847154 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e"} Mar 20 17:36:54 crc kubenswrapper[4690]: I0320 17:36:54.847179 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"810ef61dfd66653c97e50a7c5e658e3e4610648ff84dc8342c8cadb5532980bc"} Mar 20 17:36:56 crc kubenswrapper[4690]: I0320 17:36:56.249808 4690 csr.go:261] certificate signing request csr-zqf8z is approved, waiting to be issued Mar 20 17:36:56 crc kubenswrapper[4690]: I0320 17:36:56.258554 4690 csr.go:257] certificate signing request csr-zqf8z is issued Mar 20 17:36:56 crc kubenswrapper[4690]: I0320 17:36:56.862205 4690 generic.go:334] "Generic (PLEG): container finished" podID="d1c872c1-ae2b-4fd2-bb6f-e387fab73a06" containerID="0fae6e0c4bfd93a4ea4458663879b020eaa4104b33a23ce7f203217e5c6f2138" exitCode=0 Mar 20 17:36:56 crc kubenswrapper[4690]: I0320 17:36:56.862425 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-lsh75" 
event={"ID":"d1c872c1-ae2b-4fd2-bb6f-e387fab73a06","Type":"ContainerDied","Data":"0fae6e0c4bfd93a4ea4458663879b020eaa4104b33a23ce7f203217e5c6f2138"} Mar 20 17:36:56 crc kubenswrapper[4690]: I0320 17:36:56.866548 4690 generic.go:334] "Generic (PLEG): container finished" podID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerID="bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913" exitCode=0 Mar 20 17:36:56 crc kubenswrapper[4690]: I0320 17:36:56.866602 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssfrq" event={"ID":"244c63f8-c484-4edb-9cb6-0ac6a9dac136","Type":"ContainerDied","Data":"bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913"} Mar 20 17:36:56 crc kubenswrapper[4690]: I0320 17:36:56.869463 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwgk4" event={"ID":"416d626a-ef44-4b4e-91ce-51042b01a45a","Type":"ContainerStarted","Data":"0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf"} Mar 20 17:36:57 crc kubenswrapper[4690]: I0320 17:36:57.259647 4690 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-03 10:56:23.073135771 +0000 UTC Mar 20 17:36:57 crc kubenswrapper[4690]: I0320 17:36:57.260141 4690 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6929h19m25.813004187s for next certificate rotation Mar 20 17:36:57 crc kubenswrapper[4690]: I0320 17:36:57.876395 4690 generic.go:334] "Generic (PLEG): container finished" podID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerID="0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf" exitCode=0 Mar 20 17:36:57 crc kubenswrapper[4690]: I0320 17:36:57.876537 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwgk4" event={"ID":"416d626a-ef44-4b4e-91ce-51042b01a45a","Type":"ContainerDied","Data":"0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf"} Mar 20 17:36:57 crc kubenswrapper[4690]: I0320 17:36:57.878486 4690 generic.go:334] "Generic (PLEG): container finished" podID="e896e412-2900-44e4-908c-de1883bd9cdc" containerID="73435739160b21fe028a25c1fc78197ab6fa96ca8ced28914ff501355e6aff3c" exitCode=0 Mar 20 17:36:57 crc kubenswrapper[4690]: I0320 17:36:57.878550 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f26l" event={"ID":"e896e412-2900-44e4-908c-de1883bd9cdc","Type":"ContainerDied","Data":"73435739160b21fe028a25c1fc78197ab6fa96ca8ced28914ff501355e6aff3c"} Mar 20 17:36:57 crc kubenswrapper[4690]: I0320 17:36:57.881443 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssfrq" event={"ID":"244c63f8-c484-4edb-9cb6-0ac6a9dac136","Type":"ContainerStarted","Data":"d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c"} Mar 20 17:36:57 crc kubenswrapper[4690]: I0320 17:36:57.996727 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ssfrq" podStartSLOduration=3.868028445 podStartE2EDuration="59.996707903s" podCreationTimestamp="2026-03-20 17:35:58 +0000 UTC" firstStartedPulling="2026-03-20 17:36:01.159225305 +0000 UTC m=+236.025050983" lastFinishedPulling="2026-03-20 17:36:57.287904773 +0000 UTC m=+292.153730441" observedRunningTime="2026-03-20 17:36:57.99519282 +0000 UTC m=+292.861018508" watchObservedRunningTime="2026-03-20 
17:36:57.996707903 +0000 UTC m=+292.862533581" Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.261159 4690 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-29 03:17:41.436180621 +0000 UTC Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.261379 4690 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6081h40m43.174804001s for next certificate rotation Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.290677 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-lsh75" Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.356249 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw2qf\" (UniqueName: \"kubernetes.io/projected/d1c872c1-ae2b-4fd2-bb6f-e387fab73a06-kube-api-access-nw2qf\") pod \"d1c872c1-ae2b-4fd2-bb6f-e387fab73a06\" (UID: \"d1c872c1-ae2b-4fd2-bb6f-e387fab73a06\") " Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.365449 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c872c1-ae2b-4fd2-bb6f-e387fab73a06-kube-api-access-nw2qf" (OuterVolumeSpecName: "kube-api-access-nw2qf") pod "d1c872c1-ae2b-4fd2-bb6f-e387fab73a06" (UID: "d1c872c1-ae2b-4fd2-bb6f-e387fab73a06"). InnerVolumeSpecName "kube-api-access-nw2qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.458872 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw2qf\" (UniqueName: \"kubernetes.io/projected/d1c872c1-ae2b-4fd2-bb6f-e387fab73a06-kube-api-access-nw2qf\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.897589 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f26l" event={"ID":"e896e412-2900-44e4-908c-de1883bd9cdc","Type":"ContainerStarted","Data":"0fcf9ded1bc628603962f67ef9e04892fac4c60deedbbda824ddd69337caa969"} Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.899463 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdqjm" event={"ID":"29dcb3ba-2c4c-41f1-a655-02ce44ab280f","Type":"ContainerStarted","Data":"05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24"} Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.901711 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwgk4" event={"ID":"416d626a-ef44-4b4e-91ce-51042b01a45a","Type":"ContainerStarted","Data":"1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b"} Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.902633 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-lsh75" Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.902636 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-lsh75" event={"ID":"d1c872c1-ae2b-4fd2-bb6f-e387fab73a06","Type":"ContainerDied","Data":"3110d2a2be95095dc771e550d297bddaff17fbe9f3688949aa02221bbdc167fd"} Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.902678 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3110d2a2be95095dc771e550d297bddaff17fbe9f3688949aa02221bbdc167fd" Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.903690 4690 generic.go:334] "Generic (PLEG): container finished" podID="34d2f5b9-1f8e-4413-b178-58cd10fa7548" containerID="6901f038f408141511eb1c951407621da6d4ab4dff87c1828b77f43ae8798bbb" exitCode=0 Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.903720 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-66l98" event={"ID":"34d2f5b9-1f8e-4413-b178-58cd10fa7548","Type":"ContainerDied","Data":"6901f038f408141511eb1c951407621da6d4ab4dff87c1828b77f43ae8798bbb"} Mar 20 17:36:58 crc kubenswrapper[4690]: I0320 17:36:58.988621 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vwgk4" podStartSLOduration=4.44238484 podStartE2EDuration="1m2.988608101s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:35:59.840242069 +0000 UTC m=+234.706067747" lastFinishedPulling="2026-03-20 17:36:58.38646533 +0000 UTC m=+293.252291008" observedRunningTime="2026-03-20 17:36:58.987594832 +0000 UTC m=+293.853420510" watchObservedRunningTime="2026-03-20 17:36:58.988608101 +0000 UTC m=+293.854433779" Mar 20 17:36:59 crc kubenswrapper[4690]: I0320 17:36:59.006912 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6f26l" podStartSLOduration=2.806764242 podStartE2EDuration="1m0.006899257s" podCreationTimestamp="2026-03-20 17:35:59 +0000 UTC" firstStartedPulling="2026-03-20 17:36:01.158941577 +0000 UTC m=+236.024767255" lastFinishedPulling="2026-03-20 17:36:58.359076592 +0000 UTC m=+293.224902270" observedRunningTime="2026-03-20 17:36:59.0045463 +0000 UTC m=+293.870371988" watchObservedRunningTime="2026-03-20 17:36:59.006899257 +0000 UTC m=+293.872724935" Mar 20 17:36:59 crc kubenswrapper[4690]: I0320 17:36:59.317511 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:36:59 crc kubenswrapper[4690]: I0320 17:36:59.317861 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:36:59 crc kubenswrapper[4690]: I0320 17:36:59.684242 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:36:59 crc kubenswrapper[4690]: I0320 17:36:59.684299 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:36:59 crc kubenswrapper[4690]: I0320 17:36:59.922249 4690 generic.go:334] "Generic (PLEG): container finished" podID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerID="05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24" exitCode=0 Mar 20 17:36:59 crc kubenswrapper[4690]: I0320 17:36:59.922437 4690 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdqjm" event={"ID":"29dcb3ba-2c4c-41f1-a655-02ce44ab280f","Type":"ContainerDied","Data":"05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24"} Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.526923 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ssfrq" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="registry-server" probeResult="failure" output=< Mar 20 17:37:00 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 17:37:00 crc kubenswrapper[4690]: > Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.547415 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-66l98" Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.702029 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9flwk\" (UniqueName: \"kubernetes.io/projected/34d2f5b9-1f8e-4413-b178-58cd10fa7548-kube-api-access-9flwk\") pod \"34d2f5b9-1f8e-4413-b178-58cd10fa7548\" (UID: \"34d2f5b9-1f8e-4413-b178-58cd10fa7548\") " Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.710151 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d2f5b9-1f8e-4413-b178-58cd10fa7548-kube-api-access-9flwk" (OuterVolumeSpecName: "kube-api-access-9flwk") pod "34d2f5b9-1f8e-4413-b178-58cd10fa7548" (UID: "34d2f5b9-1f8e-4413-b178-58cd10fa7548"). InnerVolumeSpecName "kube-api-access-9flwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.727826 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6f26l" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" containerName="registry-server" probeResult="failure" output=< Mar 20 17:37:00 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 17:37:00 crc kubenswrapper[4690]: > Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.804412 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9flwk\" (UniqueName: \"kubernetes.io/projected/34d2f5b9-1f8e-4413-b178-58cd10fa7548-kube-api-access-9flwk\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.930573 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-66l98" event={"ID":"34d2f5b9-1f8e-4413-b178-58cd10fa7548","Type":"ContainerDied","Data":"22714bce4009fbdefd48088e188e121fb893b0cbb9f6cde85b3262cda24f5b02"} Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.930630 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22714bce4009fbdefd48088e188e121fb893b0cbb9f6cde85b3262cda24f5b02" Mar 20 17:37:00 crc kubenswrapper[4690]: I0320 17:37:00.930661 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-66l98" Mar 20 17:37:04 crc kubenswrapper[4690]: I0320 17:37:04.970849 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdqjm" event={"ID":"29dcb3ba-2c4c-41f1-a655-02ce44ab280f","Type":"ContainerStarted","Data":"58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c"} Mar 20 17:37:04 crc kubenswrapper[4690]: I0320 17:37:04.974478 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zltxc" event={"ID":"edacb8ae-57ae-41f3-b13b-a423afa0e2dd","Type":"ContainerStarted","Data":"84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a"} Mar 20 17:37:04 crc kubenswrapper[4690]: I0320 17:37:04.975645 4690 generic.go:334] "Generic (PLEG): container finished" podID="7552fec8-7b03-4ad9-8410-1705f639433e" containerID="b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02" exitCode=0 Mar 20 17:37:04 crc kubenswrapper[4690]: I0320 17:37:04.975679 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnxz2" event={"ID":"7552fec8-7b03-4ad9-8410-1705f639433e","Type":"ContainerDied","Data":"b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02"} Mar 20 17:37:05 crc kubenswrapper[4690]: I0320 17:37:05.024935 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sdqjm" podStartSLOduration=4.205084252 podStartE2EDuration="1m8.024917025s" podCreationTimestamp="2026-03-20 17:35:57 +0000 UTC" firstStartedPulling="2026-03-20 17:35:59.807478846 +0000 UTC m=+234.673304524" lastFinishedPulling="2026-03-20 17:37:03.627311589 +0000 UTC m=+298.493137297" observedRunningTime="2026-03-20 17:37:05.021755764 +0000 UTC m=+299.887581472" watchObservedRunningTime="2026-03-20 17:37:05.024917025 +0000 UTC m=+299.890742703" Mar 20 17:37:05 crc kubenswrapper[4690]: I0320 17:37:05.984284 4690 generic.go:334] "Generic (PLEG): container finished" podID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerID="84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a" exitCode=0 Mar 20 17:37:05 crc kubenswrapper[4690]: I0320 17:37:05.984339 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zltxc" event={"ID":"edacb8ae-57ae-41f3-b13b-a423afa0e2dd","Type":"ContainerDied","Data":"84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a"} Mar 20 17:37:07 crc kubenswrapper[4690]: I0320 17:37:07.095076 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:37:07 crc kubenswrapper[4690]: I0320 17:37:07.095487 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:37:07 crc kubenswrapper[4690]: I0320 17:37:07.204374 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:37:07 crc kubenswrapper[4690]: I0320 17:37:07.709057 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:37:07 crc kubenswrapper[4690]: I0320 17:37:07.709110 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:37:07 crc kubenswrapper[4690]: I0320 17:37:07.775228 4690 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:37:08 crc kubenswrapper[4690]: I0320 17:37:08.001105 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj5xl" event={"ID":"068537fa-5883-4e11-a933-87706891d0ae","Type":"ContainerStarted","Data":"805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478"} Mar 20 17:37:08 crc kubenswrapper[4690]: I0320 17:37:08.004128 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7xw" event={"ID":"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18","Type":"ContainerStarted","Data":"b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f"} Mar 20 17:37:08 crc kubenswrapper[4690]: I0320 17:37:08.009494 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnxz2" event={"ID":"7552fec8-7b03-4ad9-8410-1705f639433e","Type":"ContainerStarted","Data":"3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413"} Mar 20 17:37:08 crc kubenswrapper[4690]: I0320 17:37:08.067700 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bnxz2" podStartSLOduration=3.494692784 podStartE2EDuration="1m11.0676811s" podCreationTimestamp="2026-03-20 17:35:57 +0000 UTC" firstStartedPulling="2026-03-20 17:36:00.076231989 +0000 UTC m=+234.942057667" lastFinishedPulling="2026-03-20 17:37:07.649220265 +0000 UTC m=+302.515045983" observedRunningTime="2026-03-20 17:37:08.063153999 +0000 UTC m=+302.928979687" watchObservedRunningTime="2026-03-20 17:37:08.0676811 +0000 UTC m=+302.933506798" Mar 20 17:37:08 crc kubenswrapper[4690]: I0320 17:37:08.068689 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.018979 4690 generic.go:334] "Generic (PLEG): container finished" podID="068537fa-5883-4e11-a933-87706891d0ae" containerID="805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478" exitCode=0 Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.019095 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj5xl" event={"ID":"068537fa-5883-4e11-a933-87706891d0ae","Type":"ContainerDied","Data":"805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478"} Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.023210 4690 generic.go:334] "Generic (PLEG): container finished" podID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerID="b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f" exitCode=0 Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.023332 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7xw" event={"ID":"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18","Type":"ContainerDied","Data":"b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f"} Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.037747 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zltxc" event={"ID":"edacb8ae-57ae-41f3-b13b-a423afa0e2dd","Type":"ContainerStarted","Data":"09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716"} Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.082516 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zltxc" 
podStartSLOduration=4.471420445 podStartE2EDuration="1m10.082488166s" podCreationTimestamp="2026-03-20 17:35:59 +0000 UTC" firstStartedPulling="2026-03-20 17:36:02.258137956 +0000 UTC m=+237.123963634" lastFinishedPulling="2026-03-20 17:37:07.869205637 +0000 UTC m=+302.735031355" observedRunningTime="2026-03-20 17:37:09.07530952 +0000 UTC m=+303.941135198" watchObservedRunningTime="2026-03-20 17:37:09.082488166 +0000 UTC m=+303.948313874" Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.360678 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.404165 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.744216 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:37:09 crc kubenswrapper[4690]: I0320 17:37:09.783302 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:37:10 crc kubenswrapper[4690]: I0320 17:37:10.043205 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7xw" event={"ID":"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18","Type":"ContainerStarted","Data":"a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657"} Mar 20 17:37:10 crc kubenswrapper[4690]: I0320 17:37:10.045714 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj5xl" event={"ID":"068537fa-5883-4e11-a933-87706891d0ae","Type":"ContainerStarted","Data":"4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8"} Mar 20 17:37:10 crc kubenswrapper[4690]: I0320 17:37:10.064826 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4m7xw" podStartSLOduration=4.506500846 podStartE2EDuration="1m14.064808109s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:35:59.964009292 +0000 UTC m=+234.829834970" lastFinishedPulling="2026-03-20 17:37:09.522316565 +0000 UTC m=+304.388142233" observedRunningTime="2026-03-20 17:37:10.063198623 +0000 UTC m=+304.929024311" watchObservedRunningTime="2026-03-20 17:37:10.064808109 +0000 UTC m=+304.930633787" Mar 20 17:37:10 crc kubenswrapper[4690]: I0320 17:37:10.370308 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:37:10 crc kubenswrapper[4690]: I0320 17:37:10.370374 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:37:11 crc kubenswrapper[4690]: I0320 17:37:11.068869 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gj5xl" podStartSLOduration=3.691190715 podStartE2EDuration="1m11.068852437s" podCreationTimestamp="2026-03-20 17:36:00 +0000 UTC" firstStartedPulling="2026-03-20 17:36:02.241574593 +0000 UTC m=+237.107400271" lastFinishedPulling="2026-03-20 17:37:09.619236275 +0000 UTC m=+304.485061993" observedRunningTime="2026-03-20 17:37:11.065063758 +0000 UTC m=+305.930889436" watchObservedRunningTime="2026-03-20 17:37:11.068852437 +0000 UTC m=+305.934678115" Mar 20 17:37:11 crc kubenswrapper[4690]: I0320 17:37:11.413930 4690 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zltxc" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="registry-server" probeResult="failure" output=< Mar 20 17:37:11 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 17:37:11 crc kubenswrapper[4690]: > Mar 20 17:37:13 crc kubenswrapper[4690]: I0320 17:37:13.321022 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f26l"] Mar 20 17:37:13 crc kubenswrapper[4690]: I0320 17:37:13.321613 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6f26l" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" containerName="registry-server" containerID="cri-o://0fcf9ded1bc628603962f67ef9e04892fac4c60deedbbda824ddd69337caa969" gracePeriod=2 Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.069642 4690 generic.go:334] "Generic (PLEG): container finished" podID="e896e412-2900-44e4-908c-de1883bd9cdc" containerID="0fcf9ded1bc628603962f67ef9e04892fac4c60deedbbda824ddd69337caa969" exitCode=0 Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.069717 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f26l" event={"ID":"e896e412-2900-44e4-908c-de1883bd9cdc","Type":"ContainerDied","Data":"0fcf9ded1bc628603962f67ef9e04892fac4c60deedbbda824ddd69337caa969"} Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.518660 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.593205 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-utilities\") pod \"e896e412-2900-44e4-908c-de1883bd9cdc\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.593268 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-catalog-content\") pod \"e896e412-2900-44e4-908c-de1883bd9cdc\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.593290 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8gtg\" (UniqueName: \"kubernetes.io/projected/e896e412-2900-44e4-908c-de1883bd9cdc-kube-api-access-v8gtg\") pod \"e896e412-2900-44e4-908c-de1883bd9cdc\" (UID: \"e896e412-2900-44e4-908c-de1883bd9cdc\") " Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.594467 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-utilities" (OuterVolumeSpecName: "utilities") pod "e896e412-2900-44e4-908c-de1883bd9cdc" (UID: "e896e412-2900-44e4-908c-de1883bd9cdc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.607033 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e896e412-2900-44e4-908c-de1883bd9cdc-kube-api-access-v8gtg" (OuterVolumeSpecName: "kube-api-access-v8gtg") pod "e896e412-2900-44e4-908c-de1883bd9cdc" (UID: "e896e412-2900-44e4-908c-de1883bd9cdc"). 
InnerVolumeSpecName "kube-api-access-v8gtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.623655 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e896e412-2900-44e4-908c-de1883bd9cdc" (UID: "e896e412-2900-44e4-908c-de1883bd9cdc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.694124 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.694338 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8gtg\" (UniqueName: \"kubernetes.io/projected/e896e412-2900-44e4-908c-de1883bd9cdc-kube-api-access-v8gtg\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:14 crc kubenswrapper[4690]: I0320 17:37:14.694350 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e896e412-2900-44e4-908c-de1883bd9cdc-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.081025 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f26l" event={"ID":"e896e412-2900-44e4-908c-de1883bd9cdc","Type":"ContainerDied","Data":"6763a29e5889773af681a475bf692b525661b28dbc0ab49dcc87e155776ecaae"} Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.081159 4690 scope.go:117] "RemoveContainer" containerID="0fcf9ded1bc628603962f67ef9e04892fac4c60deedbbda824ddd69337caa969" Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.081411 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6f26l" Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.111092 4690 scope.go:117] "RemoveContainer" containerID="73435739160b21fe028a25c1fc78197ab6fa96ca8ced28914ff501355e6aff3c" Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.128523 4690 scope.go:117] "RemoveContainer" containerID="c8d4a838048210eaf37f5766143f03c2bf793451d4aeb61c3c91e10593cf9d46" Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.193090 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f26l"] Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.195476 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f26l"] Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.845830 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj"] Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.846071 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" podUID="fd7aad87-3b32-4095-b7f5-efe8d25a53a9" containerName="controller-manager" containerID="cri-o://6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620" gracePeriod=30 Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.890315 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" path="/var/lib/kubelet/pods/e896e412-2900-44e4-908c-de1883bd9cdc/volumes" Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.944001 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl"] Mar 20 17:37:15 crc kubenswrapper[4690]: I0320 17:37:15.944192 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" podUID="8a2404df-2583-48d9-a6e5-59daa7a3a8a8" containerName="route-controller-manager" containerID="cri-o://b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae" gracePeriod=30 Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.912826 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.930492 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8knx\" (UniqueName: \"kubernetes.io/projected/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-kube-api-access-k8knx\") pod \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.930542 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-config\") pod \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.930587 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-serving-cert\") pod \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.930628 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-client-ca\") pod \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\" (UID: \"8a2404df-2583-48d9-a6e5-59daa7a3a8a8\") " Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.931659 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "8a2404df-2583-48d9-a6e5-59daa7a3a8a8" (UID: "8a2404df-2583-48d9-a6e5-59daa7a3a8a8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.932216 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-config" (OuterVolumeSpecName: "config") pod "8a2404df-2583-48d9-a6e5-59daa7a3a8a8" (UID: "8a2404df-2583-48d9-a6e5-59daa7a3a8a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.937614 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8a2404df-2583-48d9-a6e5-59daa7a3a8a8" (UID: "8a2404df-2583-48d9-a6e5-59daa7a3a8a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.947574 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-kube-api-access-k8knx" (OuterVolumeSpecName: "kube-api-access-k8knx") pod "8a2404df-2583-48d9-a6e5-59daa7a3a8a8" (UID: "8a2404df-2583-48d9-a6e5-59daa7a3a8a8"). InnerVolumeSpecName "kube-api-access-k8knx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:16 crc kubenswrapper[4690]: I0320 17:37:16.996412 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.032162 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.032219 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8knx\" (UniqueName: \"kubernetes.io/projected/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-kube-api-access-k8knx\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.032228 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.032243 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a2404df-2583-48d9-a6e5-59daa7a3a8a8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.097087 4690 generic.go:334] "Generic (PLEG): container finished" podID="fd7aad87-3b32-4095-b7f5-efe8d25a53a9" containerID="6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620" exitCode=0 Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.097173 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.097193 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" event={"ID":"fd7aad87-3b32-4095-b7f5-efe8d25a53a9","Type":"ContainerDied","Data":"6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620"} Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.097233 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj" event={"ID":"fd7aad87-3b32-4095-b7f5-efe8d25a53a9","Type":"ContainerDied","Data":"d4344f54bfe290add9a4b0517651a23e843700a91d6994d394d65f155dc3897a"} Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.097296 4690 scope.go:117] "RemoveContainer" containerID="6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.099746 4690 generic.go:334] "Generic (PLEG): container finished" podID="8a2404df-2583-48d9-a6e5-59daa7a3a8a8" containerID="b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae" exitCode=0 Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.099797 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" event={"ID":"8a2404df-2583-48d9-a6e5-59daa7a3a8a8","Type":"ContainerDied","Data":"b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae"} Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.099829 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" event={"ID":"8a2404df-2583-48d9-a6e5-59daa7a3a8a8","Type":"ContainerDied","Data":"9c4fffbb7f288a3a8e7d86208e3433e98c4011ba2447b61a54ec96bf6f8d6440"} Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.099896 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.127541 4690 scope.go:117] "RemoveContainer" containerID="6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.128806 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620\": container with ID starting with 6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620 not found: ID does not exist" containerID="6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.128838 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620"} err="failed to get container status \"6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620\": rpc error: code = NotFound desc = could not find container \"6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620\": container with ID starting with 6fd29333e7e0cde725da6e53ff2d24f2041e305e46de60f70ca1f45b52d13620 not found: ID does not exist" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.128861 4690 scope.go:117] "RemoveContainer" containerID="b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.132670 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-client-ca\") pod \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.132728 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-serving-cert\") pod \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.132750 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-proxy-ca-bundles\") pod \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.132832 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-config\") pod \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.132860 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z5d5\" (UniqueName: \"kubernetes.io/projected/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-kube-api-access-7z5d5\") pod \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\" (UID: \"fd7aad87-3b32-4095-b7f5-efe8d25a53a9\") " Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.134380 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"fd7aad87-3b32-4095-b7f5-efe8d25a53a9" (UID: "fd7aad87-3b32-4095-b7f5-efe8d25a53a9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.137568 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-config" (OuterVolumeSpecName: "config") pod "fd7aad87-3b32-4095-b7f5-efe8d25a53a9" (UID: "fd7aad87-3b32-4095-b7f5-efe8d25a53a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.138209 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fd7aad87-3b32-4095-b7f5-efe8d25a53a9" (UID: "fd7aad87-3b32-4095-b7f5-efe8d25a53a9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.143590 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fd7aad87-3b32-4095-b7f5-efe8d25a53a9" (UID: "fd7aad87-3b32-4095-b7f5-efe8d25a53a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.148852 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl"] Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.149018 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-kube-api-access-7z5d5" (OuterVolumeSpecName: "kube-api-access-7z5d5") pod "fd7aad87-3b32-4095-b7f5-efe8d25a53a9" (UID: "fd7aad87-3b32-4095-b7f5-efe8d25a53a9"). InnerVolumeSpecName "kube-api-access-7z5d5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.152618 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d699b74bd-pctjl"] Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.179566 4690 scope.go:117] "RemoveContainer" containerID="b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.180099 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae\": container with ID starting with b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae not found: ID does not exist" containerID="b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.180149 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae"} err="failed to get container status \"b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae\": rpc error: code = NotFound desc = could not find container \"b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae\": container with ID starting with b2092ffe1b23b7f6077a73a45dfa48c4dc1e5e3d5eafdd336b0e36cdd2cba1ae not found: ID does not exist" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.233971 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.234009 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.234023 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.234037 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z5d5\" (UniqueName: \"kubernetes.io/projected/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-kube-api-access-7z5d5\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.234050 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd7aad87-3b32-4095-b7f5-efe8d25a53a9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.289972 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.290070 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.327560 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.368858 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.368899 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.435557 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj"] Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.437797 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.442225 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7ff4bd94b8-d4tbj"] Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.573949 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd"] Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.574202 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c872c1-ae2b-4fd2-bb6f-e387fab73a06" containerName="oc" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574218 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c872c1-ae2b-4fd2-bb6f-e387fab73a06" containerName="oc" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.574228 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d2f5b9-1f8e-4413-b178-58cd10fa7548" containerName="oc" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574236 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d2f5b9-1f8e-4413-b178-58cd10fa7548" containerName="oc" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.574247 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22016c8f-24ff-47ef-ad5f-1e22ef59ae23" containerName="pruner" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574281 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="22016c8f-24ff-47ef-ad5f-1e22ef59ae23" containerName="pruner" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.574291 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2404df-2583-48d9-a6e5-59daa7a3a8a8" containerName="route-controller-manager" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574300 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2404df-2583-48d9-a6e5-59daa7a3a8a8" containerName="route-controller-manager" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.574311 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" containerName="registry-server" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574320 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" containerName="registry-server" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.574338 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7aad87-3b32-4095-b7f5-efe8d25a53a9" containerName="controller-manager" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574346 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7aad87-3b32-4095-b7f5-efe8d25a53a9" containerName="controller-manager" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.574357 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" 
containerName="extract-utilities" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574367 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" containerName="extract-utilities" Mar 20 17:37:17 crc kubenswrapper[4690]: E0320 17:37:17.574381 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" containerName="extract-content" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574390 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" containerName="extract-content" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574510 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="22016c8f-24ff-47ef-ad5f-1e22ef59ae23" containerName="pruner" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574522 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2404df-2583-48d9-a6e5-59daa7a3a8a8" containerName="route-controller-manager" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574532 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d2f5b9-1f8e-4413-b178-58cd10fa7548" containerName="oc" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574543 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e896e412-2900-44e4-908c-de1883bd9cdc" containerName="registry-server" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574560 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c872c1-ae2b-4fd2-bb6f-e387fab73a06" containerName="oc" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.574570 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd7aad87-3b32-4095-b7f5-efe8d25a53a9" containerName="controller-manager" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.575045 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.579153 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.579337 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.579296 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.579897 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.580107 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.580180 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.583785 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd"] Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.655129 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rl4z\" (UniqueName: \"kubernetes.io/projected/0ae207bf-1228-48f1-ab12-e941daee4948-kube-api-access-8rl4z\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.655373 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-config\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.655531 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae207bf-1228-48f1-ab12-e941daee4948-serving-cert\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.655585 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-client-ca\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.751630 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.756546 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8rl4z\" (UniqueName: \"kubernetes.io/projected/0ae207bf-1228-48f1-ab12-e941daee4948-kube-api-access-8rl4z\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.756650 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-config\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.756702 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae207bf-1228-48f1-ab12-e941daee4948-serving-cert\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.756722 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-client-ca\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.757723 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-client-ca\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.758286 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-config\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.760519 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae207bf-1228-48f1-ab12-e941daee4948-serving-cert\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.773726 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rl4z\" (UniqueName: \"kubernetes.io/projected/0ae207bf-1228-48f1-ab12-e941daee4948-kube-api-access-8rl4z\") pod \"route-controller-manager-6f86fc9544-4rfsd\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.889178 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2404df-2583-48d9-a6e5-59daa7a3a8a8" 
path="/var/lib/kubelet/pods/8a2404df-2583-48d9-a6e5-59daa7a3a8a8/volumes" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.889740 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd7aad87-3b32-4095-b7f5-efe8d25a53a9" path="/var/lib/kubelet/pods/fd7aad87-3b32-4095-b7f5-efe8d25a53a9/volumes" Mar 20 17:37:17 crc kubenswrapper[4690]: I0320 17:37:17.964436 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:18 crc kubenswrapper[4690]: I0320 17:37:18.164730 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:37:18 crc kubenswrapper[4690]: I0320 17:37:18.170514 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:37:18 crc kubenswrapper[4690]: I0320 17:37:18.179742 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd"] Mar 20 17:37:18 crc kubenswrapper[4690]: W0320 17:37:18.189775 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae207bf_1228_48f1_ab12_e941daee4948.slice/crio-c221e53264c7aa7c6fc8ba79f304a71d77e7a5e3c11730cdbf22157647a0c883 WatchSource:0}: Error finding container c221e53264c7aa7c6fc8ba79f304a71d77e7a5e3c11730cdbf22157647a0c883: Status 404 returned error can't find the container with id c221e53264c7aa7c6fc8ba79f304a71d77e7a5e3c11730cdbf22157647a0c883 Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.125847 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdqjm"] Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.126394 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sdqjm" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerName="registry-server" containerID="cri-o://58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c" gracePeriod=2 Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.128807 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" event={"ID":"0ae207bf-1228-48f1-ab12-e941daee4948","Type":"ContainerStarted","Data":"60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7"} Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.128883 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" event={"ID":"0ae207bf-1228-48f1-ab12-e941daee4948","Type":"ContainerStarted","Data":"c221e53264c7aa7c6fc8ba79f304a71d77e7a5e3c11730cdbf22157647a0c883"} Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.470450 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.491113 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" podStartSLOduration=4.491091712 podStartE2EDuration="4.491091712s" podCreationTimestamp="2026-03-20 17:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:19.164828041 +0000 UTC m=+314.030653719" watchObservedRunningTime="2026-03-20 17:37:19.491091712 +0000 UTC m=+314.356917390" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.569794 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6848775948-wq8cb"] Mar 20 17:37:19 crc kubenswrapper[4690]: E0320 17:37:19.569979 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerName="extract-content" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.569990 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerName="extract-content" Mar 20 17:37:19 crc kubenswrapper[4690]: E0320 17:37:19.569999 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerName="extract-utilities" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.570004 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerName="extract-utilities" Mar 20 17:37:19 crc kubenswrapper[4690]: E0320 17:37:19.570018 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerName="registry-server" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.570025 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerName="registry-server" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.570108 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerName="registry-server" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.570442 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.572952 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.573004 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.573179 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.574369 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.574514 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.575330 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.579425 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.584942 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-catalog-content\") pod \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.585062 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-utilities\") pod \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.585189 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26622\" (UniqueName: \"kubernetes.io/projected/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-kube-api-access-26622\") pod \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\" (UID: \"29dcb3ba-2c4c-41f1-a655-02ce44ab280f\") " Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.588699 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-utilities" (OuterVolumeSpecName: "utilities") pod "29dcb3ba-2c4c-41f1-a655-02ce44ab280f" (UID: "29dcb3ba-2c4c-41f1-a655-02ce44ab280f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.592593 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-kube-api-access-26622" (OuterVolumeSpecName: "kube-api-access-26622") pod "29dcb3ba-2c4c-41f1-a655-02ce44ab280f" (UID: "29dcb3ba-2c4c-41f1-a655-02ce44ab280f"). InnerVolumeSpecName "kube-api-access-26622". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.618565 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6848775948-wq8cb"] Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.644341 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29dcb3ba-2c4c-41f1-a655-02ce44ab280f" (UID: "29dcb3ba-2c4c-41f1-a655-02ce44ab280f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.687043 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-config\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.687126 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41de6756-32f4-474f-bc58-d461313abb73-serving-cert\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.687158 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-client-ca\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.687223 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx988\" (UniqueName: \"kubernetes.io/projected/41de6756-32f4-474f-bc58-d461313abb73-kube-api-access-kx988\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.687244 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-proxy-ca-bundles\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.687300 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26622\" (UniqueName: \"kubernetes.io/projected/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-kube-api-access-26622\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.687311 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.687320 4690 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29dcb3ba-2c4c-41f1-a655-02ce44ab280f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.723007 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bnxz2"] Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.788203 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx988\" (UniqueName: \"kubernetes.io/projected/41de6756-32f4-474f-bc58-d461313abb73-kube-api-access-kx988\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.788275 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-proxy-ca-bundles\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.788309 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-config\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.788387 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41de6756-32f4-474f-bc58-d461313abb73-serving-cert\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.788422 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-client-ca\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.789742 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-client-ca\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.790202 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-proxy-ca-bundles\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.790532 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-config\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " 
pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.793103 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41de6756-32f4-474f-bc58-d461313abb73-serving-cert\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.817873 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx988\" (UniqueName: \"kubernetes.io/projected/41de6756-32f4-474f-bc58-d461313abb73-kube-api-access-kx988\") pod \"controller-manager-6848775948-wq8cb\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:19 crc kubenswrapper[4690]: I0320 17:37:19.920504 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.114880 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6848775948-wq8cb"] Mar 20 17:37:20 crc kubenswrapper[4690]: W0320 17:37:20.120164 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41de6756_32f4_474f_bc58_d461313abb73.slice/crio-5e6387354d7e6eee962a871d7beac40639c1031cc272e1915eff00db959d491f WatchSource:0}: Error finding container 5e6387354d7e6eee962a871d7beac40639c1031cc272e1915eff00db959d491f: Status 404 returned error can't find the container with id 5e6387354d7e6eee962a871d7beac40639c1031cc272e1915eff00db959d491f Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.135461 4690 generic.go:334] "Generic (PLEG): container finished" podID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" containerID="58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c" exitCode=0 Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.135537 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdqjm" event={"ID":"29dcb3ba-2c4c-41f1-a655-02ce44ab280f","Type":"ContainerDied","Data":"58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c"} Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.135571 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdqjm" event={"ID":"29dcb3ba-2c4c-41f1-a655-02ce44ab280f","Type":"ContainerDied","Data":"e9c0b404db88fda2532c20b4fa52c70a84ce96a9ebc68696b0910d64f13a07ed"} Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.135588 4690 scope.go:117] "RemoveContainer" containerID="58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.135591 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdqjm" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.138910 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" event={"ID":"41de6756-32f4-474f-bc58-d461313abb73","Type":"ContainerStarted","Data":"5e6387354d7e6eee962a871d7beac40639c1031cc272e1915eff00db959d491f"} Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.139101 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bnxz2" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" containerName="registry-server" containerID="cri-o://3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413" gracePeriod=2 Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.139214 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.155067 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdqjm"] Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.157158 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.161367 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sdqjm"] Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.162800 4690 scope.go:117] "RemoveContainer" containerID="05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.198616 4690 scope.go:117] "RemoveContainer" containerID="95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.236744 4690 scope.go:117] "RemoveContainer" containerID="58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c" Mar 20 17:37:20 crc kubenswrapper[4690]: E0320 17:37:20.237406 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c\": container with ID starting with 58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c not found: ID does not exist" containerID="58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.237450 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c"} err="failed to get container status \"58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c\": rpc error: code = NotFound desc = could not find container \"58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c\": container with ID starting with 58d939f0caa24a0c9532173f10e7d762c1e9e12a4171cfc922b95b9cf79a809c not found: ID does not exist" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.237481 4690 scope.go:117] "RemoveContainer" containerID="05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24" Mar 20 17:37:20 crc kubenswrapper[4690]: E0320 17:37:20.237831 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24\": container with ID starting with 05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24 not found: ID does not exist" containerID="05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.237865 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24"} err="failed to get container status \"05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24\": rpc error: code = NotFound desc = could not find container \"05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24\": container with ID starting with 05740b1c124e8aeab3ecc0c6eb455b7a3b15192357a286cac64023ef3f7e6b24 not found: ID does not exist" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.237891 4690 scope.go:117] "RemoveContainer" containerID="95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f" Mar 20 17:37:20 crc kubenswrapper[4690]: E0320 17:37:20.238392 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f\": container with ID starting with 95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f not found: ID does not exist" containerID="95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.238435 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f"} err="failed to get container status \"95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f\": rpc error: code = NotFound desc = could not find container \"95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f\": container with ID starting with 95102bb7662ca95eee51ed2fbbff9d55f8a797e80c4d4ecb5e9a52b3b278984f not found: ID does not exist" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.414918 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.472081 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.507272 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.507328 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:37:20 crc kubenswrapper[4690]: I0320 17:37:20.579413 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:37:21 crc kubenswrapper[4690]: I0320 17:37:21.148098 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" event={"ID":"41de6756-32f4-474f-bc58-d461313abb73","Type":"ContainerStarted","Data":"7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec"} Mar 20 17:37:21 crc kubenswrapper[4690]: I0320 17:37:21.215016 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:37:21 crc kubenswrapper[4690]: I0320 17:37:21.889653 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29dcb3ba-2c4c-41f1-a655-02ce44ab280f" path="/var/lib/kubelet/pods/29dcb3ba-2c4c-41f1-a655-02ce44ab280f/volumes" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.093429 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.174587 4690 generic.go:334] "Generic (PLEG): container finished" podID="7552fec8-7b03-4ad9-8410-1705f639433e" containerID="3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413" exitCode=0 Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.174649 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bnxz2" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.174690 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnxz2" event={"ID":"7552fec8-7b03-4ad9-8410-1705f639433e","Type":"ContainerDied","Data":"3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413"} Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.174727 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnxz2" event={"ID":"7552fec8-7b03-4ad9-8410-1705f639433e","Type":"ContainerDied","Data":"c637e3c605641d5c710e7ff8dc8b37e56853cc1561ebddb04343fee132620166"} Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.174753 4690 scope.go:117] "RemoveContainer" containerID="3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.186993 4690 scope.go:117] "RemoveContainer" containerID="b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.201410 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" podStartSLOduration=7.201390517 podStartE2EDuration="7.201390517s" podCreationTimestamp="2026-03-20 17:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:22.198825354 +0000 UTC m=+317.064651042" watchObservedRunningTime="2026-03-20 17:37:22.201390517 +0000 UTC m=+317.067216195" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.203623 4690 scope.go:117] "RemoveContainer" containerID="651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.218462 4690 scope.go:117] "RemoveContainer" containerID="3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413" Mar 20 17:37:22 crc kubenswrapper[4690]: E0320 17:37:22.220571 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413\": container with ID starting with 3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413 not found: ID does not exist" containerID="3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.220610 4690 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413"} err="failed to get container status \"3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413\": rpc error: code = NotFound desc = could not find container \"3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413\": container with ID starting with 3c42f8b24eb11bd0dc19e8c6fff2c370af1437dd837868fbe9e8fa02bd64d413 not found: ID does not exist" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.220633 4690 scope.go:117] "RemoveContainer" containerID="b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02" Mar 20 17:37:22 crc kubenswrapper[4690]: E0320 17:37:22.220819 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02\": container with ID starting with b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02 not found: ID does not exist" containerID="b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.220837 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02"} err="failed to get container status \"b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02\": rpc error: code = NotFound desc = could not find container \"b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02\": container with ID starting with b866d1b036b3eda3cae3956ae10bb651acaca83b0874a0eb5f7e956b7e87ce02 not found: ID does not exist" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.220851 4690 scope.go:117] "RemoveContainer" containerID="651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4" Mar 20 17:37:22 crc kubenswrapper[4690]: E0320 17:37:22.221045 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4\": container with ID starting with 651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4 not found: ID does not exist" containerID="651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.221063 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4"} err="failed to get container status \"651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4\": rpc error: code = NotFound desc = could not find container \"651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4\": container with ID starting with 651611eef99ad79eaaa35e47a43a5ca328ca729a209618f730db996b4b0805b4 not found: ID does not exist" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.265881 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-catalog-content\") pod \"7552fec8-7b03-4ad9-8410-1705f639433e\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.266278 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-utilities\") pod 
\"7552fec8-7b03-4ad9-8410-1705f639433e\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.266383 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd42m\" (UniqueName: \"kubernetes.io/projected/7552fec8-7b03-4ad9-8410-1705f639433e-kube-api-access-nd42m\") pod \"7552fec8-7b03-4ad9-8410-1705f639433e\" (UID: \"7552fec8-7b03-4ad9-8410-1705f639433e\") " Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.277138 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-utilities" (OuterVolumeSpecName: "utilities") pod "7552fec8-7b03-4ad9-8410-1705f639433e" (UID: "7552fec8-7b03-4ad9-8410-1705f639433e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.278893 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7552fec8-7b03-4ad9-8410-1705f639433e-kube-api-access-nd42m" (OuterVolumeSpecName: "kube-api-access-nd42m") pod "7552fec8-7b03-4ad9-8410-1705f639433e" (UID: "7552fec8-7b03-4ad9-8410-1705f639433e"). InnerVolumeSpecName "kube-api-access-nd42m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.341122 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7552fec8-7b03-4ad9-8410-1705f639433e" (UID: "7552fec8-7b03-4ad9-8410-1705f639433e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.374314 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd42m\" (UniqueName: \"kubernetes.io/projected/7552fec8-7b03-4ad9-8410-1705f639433e-kube-api-access-nd42m\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.374358 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.374367 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7552fec8-7b03-4ad9-8410-1705f639433e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.520215 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bnxz2"] Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.527189 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bnxz2"] Mar 20 17:37:22 crc kubenswrapper[4690]: I0320 17:37:22.794752 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2b2sh"] Mar 20 17:37:23 crc kubenswrapper[4690]: I0320 17:37:23.522440 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gj5xl"] Mar 20 17:37:23 crc kubenswrapper[4690]: I0320 17:37:23.523706 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gj5xl" podUID="068537fa-5883-4e11-a933-87706891d0ae" 
containerName="registry-server" containerID="cri-o://4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8" gracePeriod=2 Mar 20 17:37:23 crc kubenswrapper[4690]: I0320 17:37:23.889333 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" path="/var/lib/kubelet/pods/7552fec8-7b03-4ad9-8410-1705f639433e/volumes" Mar 20 17:37:23 crc kubenswrapper[4690]: I0320 17:37:23.894905 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:37:23 crc kubenswrapper[4690]: I0320 17:37:23.995491 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-catalog-content\") pod \"068537fa-5883-4e11-a933-87706891d0ae\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " Mar 20 17:37:23 crc kubenswrapper[4690]: I0320 17:37:23.995863 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4tmr\" (UniqueName: \"kubernetes.io/projected/068537fa-5883-4e11-a933-87706891d0ae-kube-api-access-m4tmr\") pod \"068537fa-5883-4e11-a933-87706891d0ae\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " Mar 20 17:37:23 crc kubenswrapper[4690]: I0320 17:37:23.995926 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-utilities\") pod \"068537fa-5883-4e11-a933-87706891d0ae\" (UID: \"068537fa-5883-4e11-a933-87706891d0ae\") " Mar 20 17:37:23 crc kubenswrapper[4690]: I0320 17:37:23.997304 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-utilities" (OuterVolumeSpecName: "utilities") pod "068537fa-5883-4e11-a933-87706891d0ae" (UID: "068537fa-5883-4e11-a933-87706891d0ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.002075 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068537fa-5883-4e11-a933-87706891d0ae-kube-api-access-m4tmr" (OuterVolumeSpecName: "kube-api-access-m4tmr") pod "068537fa-5883-4e11-a933-87706891d0ae" (UID: "068537fa-5883-4e11-a933-87706891d0ae"). InnerVolumeSpecName "kube-api-access-m4tmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.098101 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.098144 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4tmr\" (UniqueName: \"kubernetes.io/projected/068537fa-5883-4e11-a933-87706891d0ae-kube-api-access-m4tmr\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.163692 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "068537fa-5883-4e11-a933-87706891d0ae" (UID: "068537fa-5883-4e11-a933-87706891d0ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.190128 4690 generic.go:334] "Generic (PLEG): container finished" podID="068537fa-5883-4e11-a933-87706891d0ae" containerID="4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8" exitCode=0 Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.190165 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj5xl" event={"ID":"068537fa-5883-4e11-a933-87706891d0ae","Type":"ContainerDied","Data":"4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8"} Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.190188 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj5xl" event={"ID":"068537fa-5883-4e11-a933-87706891d0ae","Type":"ContainerDied","Data":"c575e70c118f3401d4f7337a6269f9bf041442260c12872a11b892ef3e082faa"} Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.190205 4690 scope.go:117] "RemoveContainer" containerID="4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.190330 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gj5xl" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.198995 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/068537fa-5883-4e11-a933-87706891d0ae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.217420 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gj5xl"] Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.219297 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gj5xl"] Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.220330 4690 scope.go:117] "RemoveContainer" containerID="805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.252065 4690 scope.go:117] "RemoveContainer" containerID="dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.277272 4690 scope.go:117] "RemoveContainer" containerID="4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8" Mar 20 17:37:24 crc kubenswrapper[4690]: E0320 17:37:24.278660 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8\": container with ID starting with 4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8 not found: ID does not exist" containerID="4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.278695 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8"} err="failed to get container status \"4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8\": rpc error: code = NotFound desc = could not find container \"4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8\": container with ID starting with 4140b0d641cf878254f672a33d6785be5e012927cc77271f69b840cf6dc17cd8 not found: ID does not exist" Mar 20 17:37:24 crc 
kubenswrapper[4690]: I0320 17:37:24.278720 4690 scope.go:117] "RemoveContainer" containerID="805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478" Mar 20 17:37:24 crc kubenswrapper[4690]: E0320 17:37:24.279421 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478\": container with ID starting with 805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478 not found: ID does not exist" containerID="805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.279446 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478"} err="failed to get container status \"805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478\": rpc error: code = NotFound desc = could not find container \"805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478\": container with ID starting with 805574f227d2d2bd444371e85c61f18a42097cbc92f61cc596f9ca7f48c54478 not found: ID does not exist" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.279467 4690 scope.go:117] "RemoveContainer" containerID="dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca" Mar 20 17:37:24 crc kubenswrapper[4690]: E0320 17:37:24.279871 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca\": container with ID starting with dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca not found: ID does not exist" containerID="dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca" Mar 20 17:37:24 crc kubenswrapper[4690]: I0320 17:37:24.279890 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca"} err="failed to get container status \"dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca\": rpc error: code = NotFound desc = could not find container \"dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca\": container with ID starting with dc00855c4b1a0da44de8bb1d5b47f3854a15f9287db0d3c0cd01acd4e0bab1ca not found: ID does not exist" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.443966 4690 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.446598 4690 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.446927 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" containerName="registry-server" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.446953 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" containerName="registry-server" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.446971 4690 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7552fec8-7b03-4ad9-8410-1705f639433e" containerName="extract-utilities" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.446980 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" containerName="extract-utilities" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.446990 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068537fa-5883-4e11-a933-87706891d0ae" containerName="extract-utilities" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.446999 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="068537fa-5883-4e11-a933-87706891d0ae" containerName="extract-utilities" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.447010 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068537fa-5883-4e11-a933-87706891d0ae" containerName="extract-content" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.447017 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="068537fa-5883-4e11-a933-87706891d0ae" containerName="extract-content" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.447026 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068537fa-5883-4e11-a933-87706891d0ae" containerName="registry-server" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.447034 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="068537fa-5883-4e11-a933-87706891d0ae" containerName="registry-server" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.447044 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" containerName="extract-content" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.447051 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" containerName="extract-content" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.460546 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="7552fec8-7b03-4ad9-8410-1705f639433e" containerName="registry-server" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.460624 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="068537fa-5883-4e11-a933-87706891d0ae" containerName="registry-server" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462020 4690 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462096 4690 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462219 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.462526 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462557 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.462578 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462589 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.462607 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462620 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.462830 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462841 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462609 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e" gracePeriod=15 Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.462864 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462843 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12" gracePeriod=15 Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462920 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5" gracePeriod=15 Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462957 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e" gracePeriod=15 Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462995 4690 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76" gracePeriod=15 Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.462878 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.463223 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463244 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.463314 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463330 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.463351 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463366 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.463396 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463409 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463800 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463823 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463837 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463853 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463877 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463896 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463911 4690 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.463928 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.464175 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.464190 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.464616 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.467796 4690 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.504510 4690 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.514413 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.514494 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.514615 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.514668 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.514724 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.514771 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.514806 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.514833 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616053 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616122 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616140 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616160 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616189 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616184 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616217 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616243 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616281 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616289 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616314 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616312 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616347 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616364 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616383 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.616423 4690 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.806405 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:25 crc kubenswrapper[4690]: W0320 17:37:25.833028 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-391f1a48cd320662b7b725601192c318d3bb2be69dcddac0d4427c92fe0060bb WatchSource:0}: Error finding container 391f1a48cd320662b7b725601192c318d3bb2be69dcddac0d4427c92fe0060bb: Status 404 returned error can't find the container with id 391f1a48cd320662b7b725601192c318d3bb2be69dcddac0d4427c92fe0060bb Mar 20 17:37:25 crc kubenswrapper[4690]: E0320 17:37:25.841044 4690 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9d4a56d6035d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:37:25.840560989 +0000 UTC m=+320.706386667,LastTimestamp:2026-03-20 17:37:25.840560989 +0000 UTC m=+320.706386667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:37:25 crc kubenswrapper[4690]: I0320 17:37:25.891870 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068537fa-5883-4e11-a933-87706891d0ae" path="/var/lib/kubelet/pods/068537fa-5883-4e11-a933-87706891d0ae/volumes" Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.207965 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.210089 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.211326 4690 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76" exitCode=0 Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.211355 4690 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12" exitCode=0 Mar 20 17:37:26 crc kubenswrapper[4690]: 
I0320 17:37:26.211364 4690 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5" exitCode=0 Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.211372 4690 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e" exitCode=2 Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.211438 4690 scope.go:117] "RemoveContainer" containerID="60a788ca120045ef7b2481c3da0afac1f8ae2522b3edd3b73a48f5f8dab045a4" Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.214571 4690 generic.go:334] "Generic (PLEG): container finished" podID="de98586b-5aaf-464b-aceb-0493a4c4a84b" containerID="8ecf0177b9fbb4044bcd83d45908355a3bf5d82ac73fe3f3e442ed72a7a4418e" exitCode=0 Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.214727 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de98586b-5aaf-464b-aceb-0493a4c4a84b","Type":"ContainerDied","Data":"8ecf0177b9fbb4044bcd83d45908355a3bf5d82ac73fe3f3e442ed72a7a4418e"} Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.215694 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.216553 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0d3bb51a70f2d1c02efb6c8a28224826384cf12d0e33c9c1769ca6d92c266120"} Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.216587 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"391f1a48cd320662b7b725601192c318d3bb2be69dcddac0d4427c92fe0060bb"} Mar 20 17:37:26 crc kubenswrapper[4690]: E0320 17:37:26.217349 4690 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:37:26 crc kubenswrapper[4690]: I0320 17:37:26.217485 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.226820 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.631285 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.632057 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.645887 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-var-lock\") pod \"de98586b-5aaf-464b-aceb-0493a4c4a84b\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.645946 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de98586b-5aaf-464b-aceb-0493a4c4a84b-kube-api-access\") pod \"de98586b-5aaf-464b-aceb-0493a4c4a84b\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.646018 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-kubelet-dir\") pod \"de98586b-5aaf-464b-aceb-0493a4c4a84b\" (UID: \"de98586b-5aaf-464b-aceb-0493a4c4a84b\") " Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.646020 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-var-lock" (OuterVolumeSpecName: "var-lock") pod "de98586b-5aaf-464b-aceb-0493a4c4a84b" (UID: "de98586b-5aaf-464b-aceb-0493a4c4a84b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.646143 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de98586b-5aaf-464b-aceb-0493a4c4a84b" (UID: "de98586b-5aaf-464b-aceb-0493a4c4a84b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.646521 4690 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.646552 4690 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de98586b-5aaf-464b-aceb-0493a4c4a84b-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.651199 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de98586b-5aaf-464b-aceb-0493a4c4a84b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de98586b-5aaf-464b-aceb-0493a4c4a84b" (UID: "de98586b-5aaf-464b-aceb-0493a4c4a84b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.747967 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de98586b-5aaf-464b-aceb-0493a4c4a84b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.806071 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.806978 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.807576 4690 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.807837 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.848596 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.848700 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.848734 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.848967 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.849015 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.849030 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.895083 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.950352 4690 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.950560 4690 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:27 crc kubenswrapper[4690]: I0320 17:37:27.950573 4690 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.238823 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.238842 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de98586b-5aaf-464b-aceb-0493a4c4a84b","Type":"ContainerDied","Data":"8ddfae8f7f946034d5a5ba03c4905671b9c686703d33af849f26c8551d5d4df5"} Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.238914 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ddfae8f7f946034d5a5ba03c4905671b9c686703d33af849f26c8551d5d4df5" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.242128 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.243062 4690 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e" exitCode=0 Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.243209 4690 scope.go:117] "RemoveContainer" containerID="af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.243224 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.244225 4690 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.244643 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.245794 4690 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.246380 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.248314 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.248851 4690 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.264341 4690 scope.go:117] "RemoveContainer" containerID="b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.283946 4690 scope.go:117] "RemoveContainer" containerID="8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.302416 4690 scope.go:117] "RemoveContainer" containerID="83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.322666 4690 scope.go:117] "RemoveContainer" containerID="abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.350431 4690 scope.go:117] "RemoveContainer" containerID="438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.377718 4690 scope.go:117] "RemoveContainer" containerID="af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76" Mar 20 17:37:28 crc kubenswrapper[4690]: E0320 17:37:28.378631 4690 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\": container with ID starting with af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76 not found: ID does not exist" containerID="af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.378715 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76"} err="failed to get container status \"af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\": rpc error: code = NotFound desc = could not find container \"af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76\": container with ID starting with af5080d60c7a6c75aac659ab9995f5f78392919748687dc3c81f6df7af1afe76 not found: ID does not exist" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.378762 4690 scope.go:117] "RemoveContainer" containerID="b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12" Mar 20 17:37:28 crc kubenswrapper[4690]: E0320 17:37:28.379435 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\": container with ID starting with b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12 not found: ID does not exist" containerID="b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.379501 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12"} err="failed to get container status \"b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\": rpc error: code = NotFound desc = could not find container \"b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12\": container with ID starting with b6d1877a8c2e19c04c44916cbcd68e19a117e4d6075b33ce7131064590120b12 not found: ID does not exist" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.379541 4690 scope.go:117] "RemoveContainer" containerID="8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5" Mar 20 17:37:28 crc kubenswrapper[4690]: E0320 17:37:28.381172 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\": container with ID starting with 8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5 not found: ID does not exist" containerID="8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.381242 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5"} err="failed to get container status \"8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\": rpc error: code = NotFound desc = could not find container \"8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5\": container with ID starting with 8fe2bb59ee9fc82c3e49b375d294aebc73e2175d699416cb28c587a153cbadc5 not found: ID does not exist" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.381310 4690 scope.go:117] "RemoveContainer" 
containerID="83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e" Mar 20 17:37:28 crc kubenswrapper[4690]: E0320 17:37:28.381767 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\": container with ID starting with 83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e not found: ID does not exist" containerID="83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.381819 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e"} err="failed to get container status \"83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\": rpc error: code = NotFound desc = could not find container \"83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e\": container with ID starting with 83d020fd903a7b604233a4229c9a201a78f0f9d41864c94e82220090dd73e69e not found: ID does not exist" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.381851 4690 scope.go:117] "RemoveContainer" containerID="abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e" Mar 20 17:37:28 crc kubenswrapper[4690]: E0320 17:37:28.382432 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\": container with ID starting with abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e not found: ID does not exist" containerID="abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.382471 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e"} err="failed to get container status \"abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\": rpc error: code = NotFound desc = could not find container \"abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e\": container with ID starting with abc5b19d4175f97a26633b3c61b49147f93e1edeb8975964cb23bbe474f6326e not found: ID does not exist" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.382491 4690 scope.go:117] "RemoveContainer" containerID="438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7" Mar 20 17:37:28 crc kubenswrapper[4690]: E0320 17:37:28.382804 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\": container with ID starting with 438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7 not found: ID does not exist" containerID="438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7" Mar 20 17:37:28 crc kubenswrapper[4690]: I0320 17:37:28.382855 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7"} err="failed to get container status \"438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\": rpc error: code = NotFound desc = could not find container \"438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7\": container with ID starting with 
438a96b878fe413aa54a56021b7ca5d2d38226050a036c2ce144aaead090aff7 not found: ID does not exist" Mar 20 17:37:28 crc kubenswrapper[4690]: E0320 17:37:28.792423 4690 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9d4a56d6035d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:37:25.840560989 +0000 UTC m=+320.706386667,LastTimestamp:2026-03-20 17:37:25.840560989 +0000 UTC m=+320.706386667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:37:29 crc kubenswrapper[4690]: I0320 17:37:29.921641 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:29 crc kubenswrapper[4690]: I0320 17:37:29.929462 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:37:29 crc kubenswrapper[4690]: I0320 17:37:29.930434 4690 status_manager.go:851] "Failed to get status for pod" podUID="41de6756-32f4-474f-bc58-d461313abb73" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6848775948-wq8cb\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:29 crc kubenswrapper[4690]: I0320 17:37:29.931225 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:32 crc kubenswrapper[4690]: E0320 17:37:32.107181 4690 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:32 crc kubenswrapper[4690]: E0320 17:37:32.107792 4690 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:32 crc kubenswrapper[4690]: E0320 17:37:32.108096 4690 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:32 crc kubenswrapper[4690]: E0320 17:37:32.110722 4690 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:32 crc kubenswrapper[4690]: E0320 17:37:32.110924 4690 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:32 crc kubenswrapper[4690]: I0320 17:37:32.110955 4690 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 17:37:32 crc kubenswrapper[4690]: E0320 17:37:32.111180 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms" Mar 20 17:37:32 crc kubenswrapper[4690]: E0320 17:37:32.312295 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Mar 20 17:37:32 crc kubenswrapper[4690]: E0320 17:37:32.714241 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Mar 20 17:37:33 crc kubenswrapper[4690]: E0320 17:37:33.022934 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:37:33Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:37:33Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:37:33Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:37:33Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:33 crc kubenswrapper[4690]: E0320 17:37:33.023668 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:33 crc kubenswrapper[4690]: E0320 17:37:33.024342 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:33 crc 
kubenswrapper[4690]: E0320 17:37:33.024696 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:33 crc kubenswrapper[4690]: E0320 17:37:33.025055 4690 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:33 crc kubenswrapper[4690]: E0320 17:37:33.025088 4690 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:37:33 crc kubenswrapper[4690]: E0320 17:37:33.516172 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Mar 20 17:37:34 crc kubenswrapper[4690]: I0320 17:37:34.294893 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:37:34 crc kubenswrapper[4690]: I0320 17:37:34.294962 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:37:34 crc kubenswrapper[4690]: I0320 17:37:34.295021 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:37:34 crc kubenswrapper[4690]: I0320 17:37:34.295129 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:37:34 crc kubenswrapper[4690]: I0320 17:37:34.295178 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:37:34 crc kubenswrapper[4690]: W0320 17:37:34.296287 4690 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27296": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:34 crc kubenswrapper[4690]: E0320 17:37:34.296391 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27296\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:34 crc kubenswrapper[4690]: W0320 17:37:34.296535 4690 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27309": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:34 crc kubenswrapper[4690]: E0320 17:37:34.296589 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27309\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:34 crc kubenswrapper[4690]: W0320 17:37:34.296707 4690 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27309": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:34 crc kubenswrapper[4690]: E0320 17:37:34.296759 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27309\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:34 crc kubenswrapper[4690]: W0320 17:37:34.297099 4690 reflector.go:561] object-"openshift-multus"/"metrics-daemon-secret": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27296": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:34 crc kubenswrapper[4690]: E0320 17:37:34.297201 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"metrics-daemon-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27296\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.118547 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.295364 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.295388 4690 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.295479 4690 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.295586 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:39:37.295542681 +0000 UTC m=+452.161368409 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.295633 4690 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: failed to sync secret cache: timed out waiting for the condition Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.295645 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:39:37.295604662 +0000 UTC m=+452.161430370 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.295695 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs podName:3cb690cf-caea-4c1b-ad3c-7e17a802b1a3 nodeName:}" failed. No retries permitted until 2026-03-20 17:39:37.295678214 +0000 UTC m=+452.161504162 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs") pod "network-metrics-daemon-bgj72" (UID: "3cb690cf-caea-4c1b-ad3c-7e17a802b1a3") : failed to sync secret cache: timed out waiting for the condition Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.295699 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:35 crc kubenswrapper[4690]: W0320 17:37:35.296591 4690 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27309": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:35 crc kubenswrapper[4690]: E0320 17:37:35.296717 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27309\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:35 crc kubenswrapper[4690]: I0320 17:37:35.888681 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:35 crc kubenswrapper[4690]: I0320 17:37:35.889281 4690 status_manager.go:851] "Failed to get status for pod" podUID="41de6756-32f4-474f-bc58-d461313abb73" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6848775948-wq8cb\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.295831 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.296171 4690 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.296286 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:39:38.296227876 +0000 UTC m=+453.162053584 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.295861 4690 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.296339 4690 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.296421 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:39:38.29639776 +0000 UTC m=+453.162223468 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 20 17:37:36 crc kubenswrapper[4690]: W0320 17:37:36.705519 4690 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27309": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.705702 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27309\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:36 crc kubenswrapper[4690]: W0320 17:37:36.905294 4690 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27309": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.905472 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27309\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:36 crc kubenswrapper[4690]: W0320 17:37:36.981217 4690 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27296": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:36 crc kubenswrapper[4690]: E0320 17:37:36.981355 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27296\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:37 crc kubenswrapper[4690]: W0320 17:37:37.007352 4690 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27309": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:37 crc kubenswrapper[4690]: E0320 17:37:37.007425 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27309\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:37 crc kubenswrapper[4690]: W0320 17:37:37.319446 4690 reflector.go:561] object-"openshift-multus"/"metrics-daemon-secret": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27296": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:37 crc kubenswrapper[4690]: E0320 17:37:37.319556 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"metrics-daemon-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27296\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:38 crc kubenswrapper[4690]: E0320 17:37:38.319675 4690 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="6.4s" Mar 20 17:37:38 crc kubenswrapper[4690]: E0320 17:37:38.798993 4690 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9d4a56d6035d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:37:25.840560989 +0000 UTC m=+320.706386667,LastTimestamp:2026-03-20 17:37:25.840560989 +0000 UTC m=+320.706386667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.327875 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.329194 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.329383 4690 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198" exitCode=1 Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.329478 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198"} Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.330116 4690 scope.go:117] "RemoveContainer" containerID="43215f00bdcc0d708039a3dd34ce62baa101c8218cc73255f2027f3dbfe60198" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.330895 4690 status_manager.go:851] "Failed to get status for pod" podUID="41de6756-32f4-474f-bc58-d461313abb73" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6848775948-wq8cb\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.331673 4690 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.332220 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.882913 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.884058 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.884418 4690 status_manager.go:851] "Failed to get status for pod" podUID="41de6756-32f4-474f-bc58-d461313abb73" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6848775948-wq8cb\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.884608 4690 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.909156 4690 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1ec4f2e-81b3-4b81-b071-1306b93f352a" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.909213 4690 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1ec4f2e-81b3-4b81-b071-1306b93f352a" Mar 20 17:37:39 crc kubenswrapper[4690]: E0320 17:37:39.910044 4690 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:39 crc kubenswrapper[4690]: I0320 17:37:39.910787 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:39 crc kubenswrapper[4690]: W0320 17:37:39.948144 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-749d6079ef9b9adc820698813a718745da66044b0ec77ec59fcb1cd7493f648d WatchSource:0}: Error finding container 749d6079ef9b9adc820698813a718745da66044b0ec77ec59fcb1cd7493f648d: Status 404 returned error can't find the container with id 749d6079ef9b9adc820698813a718745da66044b0ec77ec59fcb1cd7493f648d Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.342131 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.343475 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.343579 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bcefc31b5bac4f05c83160556c8975dd48988316945de0c7bad56507a25b6a30"} Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.345070 4690 status_manager.go:851] "Failed to get status for pod" podUID="41de6756-32f4-474f-bc58-d461313abb73" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6848775948-wq8cb\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.345625 4690 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.346187 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.346642 4690 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="28846e8ee6b59debf1d9fbb0efa2218d4565b68c61e5e4c2dce1a4e5af78ec76" exitCode=0 Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.346691 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"28846e8ee6b59debf1d9fbb0efa2218d4565b68c61e5e4c2dce1a4e5af78ec76"} Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.346721 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"749d6079ef9b9adc820698813a718745da66044b0ec77ec59fcb1cd7493f648d"} Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.347060 4690 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1ec4f2e-81b3-4b81-b071-1306b93f352a" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.347092 4690 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1ec4f2e-81b3-4b81-b071-1306b93f352a" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.347475 4690 status_manager.go:851] "Failed to get status for pod" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:40 crc kubenswrapper[4690]: E0320 17:37:40.347600 4690 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.347861 4690 status_manager.go:851] "Failed to get status for pod" podUID="41de6756-32f4-474f-bc58-d461313abb73" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6848775948-wq8cb\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.348156 4690 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 17:37:40 crc kubenswrapper[4690]: W0320 17:37:40.355537 4690 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27309": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 17:37:40 crc kubenswrapper[4690]: E0320 17:37:40.355639 4690 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27309\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.824401 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:37:40 crc kubenswrapper[4690]: I0320 17:37:40.833597 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:37:41 crc kubenswrapper[4690]: I0320 17:37:41.355822 4690 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e31036b94c084812f9c7757bc9ce724c8346ef917f73378ce372ac8831549b0b"} Mar 20 17:37:41 crc kubenswrapper[4690]: I0320 17:37:41.356982 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f5c140c6ef6291b9d8db20d6545c37570696371fbc2efc420a6a32a20bc73f96"} Mar 20 17:37:41 crc kubenswrapper[4690]: I0320 17:37:41.357076 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"66c73ec1849fa0ebe47eba7420ac430a9cca01edd24c38fe6e0fd46d0216698a"} Mar 20 17:37:41 crc kubenswrapper[4690]: I0320 17:37:41.357269 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:37:42 crc kubenswrapper[4690]: I0320 17:37:42.373802 4690 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1ec4f2e-81b3-4b81-b071-1306b93f352a" Mar 20 17:37:42 crc kubenswrapper[4690]: I0320 17:37:42.373846 4690 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1ec4f2e-81b3-4b81-b071-1306b93f352a" Mar 20 17:37:42 crc kubenswrapper[4690]: I0320 17:37:42.374073 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4fbe2e1a689181a4d5fc73027d2d2723af2aabc4e62fd49d1c3fbcb50ffce5bd"} Mar 20 17:37:42 crc kubenswrapper[4690]: I0320 17:37:42.374100 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2a86063ec87e2a3d813b5de0a424c2334c523feed632509440d194e36098ec82"} Mar 20 17:37:42 crc kubenswrapper[4690]: I0320 17:37:42.374163 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:44 crc kubenswrapper[4690]: I0320 17:37:44.911415 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:44 crc kubenswrapper[4690]: I0320 17:37:44.911731 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:44 crc kubenswrapper[4690]: I0320 17:37:44.922055 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:47 crc kubenswrapper[4690]: I0320 17:37:47.302078 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 17:37:47 crc kubenswrapper[4690]: I0320 17:37:47.302082 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 17:37:47 crc kubenswrapper[4690]: I0320 17:37:47.302905 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 17:37:47 crc kubenswrapper[4690]: I0320 17:37:47.385034 4690 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:37:47 crc kubenswrapper[4690]: I0320 17:37:47.415081 4690 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5ea9b3e2-4a40-4ed1-ab16-033384f4fc0b" Mar 20 17:37:47 crc kubenswrapper[4690]: I0320 17:37:47.822647 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" podUID="1a46a1ee-5f40-4d85-b726-d758b7ceff37" containerName="oauth-openshift" containerID="cri-o://ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca" gracePeriod=15 Mar 20 17:37:47 crc kubenswrapper[4690]: I0320 17:37:47.971016 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.343135 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.413833 4690 generic.go:334] "Generic (PLEG): container finished" podID="1a46a1ee-5f40-4d85-b726-d758b7ceff37" containerID="ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca" exitCode=0 Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.413887 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.413931 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" event={"ID":"1a46a1ee-5f40-4d85-b726-d758b7ceff37","Type":"ContainerDied","Data":"ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca"} Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.413975 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2b2sh" event={"ID":"1a46a1ee-5f40-4d85-b726-d758b7ceff37","Type":"ContainerDied","Data":"7f1b4bedf31dd6cf015182945d19cf4cd140006f9784c489c1d61c2a1433d0bd"} Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.413995 4690 scope.go:117] "RemoveContainer" containerID="ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.414404 4690 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1ec4f2e-81b3-4b81-b071-1306b93f352a" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.414430 4690 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f1ec4f2e-81b3-4b81-b071-1306b93f352a" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.424505 4690 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5ea9b3e2-4a40-4ed1-ab16-033384f4fc0b" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.434265 4690 scope.go:117] "RemoveContainer" containerID="ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca" Mar 20 17:37:48 crc kubenswrapper[4690]: E0320 17:37:48.434813 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca\": container with ID starting with ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca not found: ID does not exist" containerID="ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.434857 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca"} err="failed to get container status \"ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca\": rpc error: code = NotFound desc = could not find container \"ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca\": container with ID starting with ef89507084b46c18386d13773118a9bf4f9d9e196a762ad2f4233f51a1e58cca not found: ID does not exist" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.498288 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-idp-0-file-data\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.498405 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-ocp-branding-template\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.498488 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-service-ca\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.498519 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-dir\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.498553 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-login\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.498599 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2vf\" (UniqueName: \"kubernetes.io/projected/1a46a1ee-5f40-4d85-b726-d758b7ceff37-kube-api-access-vr2vf\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.498650 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-session\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc 
kubenswrapper[4690]: I0320 17:37:48.499078 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-provider-selection\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499151 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-error\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499225 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-policies\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499306 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-trusted-ca-bundle\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499363 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-router-certs\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499416 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-cliconfig\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499464 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-serving-cert\") pod \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\" (UID: \"1a46a1ee-5f40-4d85-b726-d758b7ceff37\") " Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.498698 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499136 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499953 4690 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.499967 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.500158 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.500724 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.500763 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.504540 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a46a1ee-5f40-4d85-b726-d758b7ceff37-kube-api-access-vr2vf" (OuterVolumeSpecName: "kube-api-access-vr2vf") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "kube-api-access-vr2vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.504737 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.504823 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.505325 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.505692 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.506587 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.507287 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.507463 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.507676 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1a46a1ee-5f40-4d85-b726-d758b7ceff37" (UID: "1a46a1ee-5f40-4d85-b726-d758b7ceff37"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601475 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601501 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2vf\" (UniqueName: \"kubernetes.io/projected/1a46a1ee-5f40-4d85-b726-d758b7ceff37-kube-api-access-vr2vf\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601510 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601519 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601529 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601537 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601545 4690 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601554 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601563 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601571 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601579 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: I0320 17:37:48.601589 4690 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/1a46a1ee-5f40-4d85-b726-d758b7ceff37-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:48 crc kubenswrapper[4690]: E0320 17:37:48.830611 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a46a1ee_5f40_4d85_b726_d758b7ceff37.slice/crio-7f1b4bedf31dd6cf015182945d19cf4cd140006f9784c489c1d61c2a1433d0bd\": RecentStats: unable to find data in memory cache]" Mar 20 17:37:49 crc kubenswrapper[4690]: E0320 17:37:49.904244 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bgj72" podUID="3cb690cf-caea-4c1b-ad3c-7e17a802b1a3" Mar 20 17:37:50 crc kubenswrapper[4690]: E0320 17:37:50.897821 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:37:50 crc kubenswrapper[4690]: E0320 17:37:50.908345 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:37:50 crc kubenswrapper[4690]: E0320 17:37:50.915575 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:37:50 crc kubenswrapper[4690]: I0320 17:37:50.986428 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 17:37:56 crc kubenswrapper[4690]: I0320 17:37:56.773402 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 17:37:56 crc kubenswrapper[4690]: I0320 17:37:56.900021 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 17:37:57 crc kubenswrapper[4690]: I0320 17:37:57.400047 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 17:37:57 crc kubenswrapper[4690]: I0320 17:37:57.817185 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 17:37:58 crc kubenswrapper[4690]: I0320 17:37:58.134732 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 17:37:58 crc kubenswrapper[4690]: I0320 17:37:58.178208 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 17:37:58 crc kubenswrapper[4690]: I0320 17:37:58.543351 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:37:58 crc 
kubenswrapper[4690]: I0320 17:37:58.577511 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 17:37:58 crc kubenswrapper[4690]: I0320 17:37:58.823297 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 17:37:59 crc kubenswrapper[4690]: I0320 17:37:59.435213 4690 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 17:37:59 crc kubenswrapper[4690]: I0320 17:37:59.494501 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 17:37:59 crc kubenswrapper[4690]: I0320 17:37:59.548102 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 17:37:59 crc kubenswrapper[4690]: I0320 17:37:59.830058 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 17:37:59 crc kubenswrapper[4690]: I0320 17:37:59.957332 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.239991 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.250538 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.345034 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.410828 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.422546 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.771781 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.778424 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.882639 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.916975 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 17:38:00 crc kubenswrapper[4690]: I0320 17:38:00.994493 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.008972 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.051113 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.123762 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.256815 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.275065 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.317765 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.335959 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.368014 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.433904 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.488276 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.545327 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.684531 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.858340 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.881167 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.890009 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:38:01 crc kubenswrapper[4690]: I0320 17:38:01.909635 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 17:38:01 crc 
kubenswrapper[4690]: I0320 17:38:01.959914 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.044790 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.114085 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.114444 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.200584 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.322388 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.362703 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.436070 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.507523 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.564819 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.663458 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.721374 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.735558 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.752037 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.815568 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.826127 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.850281 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.882429 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.882674 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.928586 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 17:38:02 crc kubenswrapper[4690]: I0320 17:38:02.973136 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.022731 4690 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.159032 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.193924 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.198590 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.231401 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.294650 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.404079 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.488822 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.500213 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.530782 4690 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.659307 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.678833 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.699417 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.744316 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.882651 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.965758 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.972970 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 17:38:03 crc kubenswrapper[4690]: I0320 17:38:03.973492 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.018338 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.020413 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.039678 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.100144 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.109430 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.133517 4690 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.136921 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.209813 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.215382 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.230327 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.272480 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.381523 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.411860 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.509828 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.579456 4690 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.611933 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.671699 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.714579 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.831011 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.831525 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.846873 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.919733 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 17:38:04 crc kubenswrapper[4690]: I0320 17:38:04.962783 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.281012 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.311559 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.428813 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.535105 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.543846 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.562831 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.594215 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.718151 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.722117 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 17:38:05 crc kubenswrapper[4690]: I0320 17:38:05.940311 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.038012 4690 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.074352 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.256529 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.328764 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.427009 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.518639 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.605927 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.636388 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.702680 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.773876 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.814545 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.834893 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.843285 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.849822 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.927887 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.950905 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 17:38:06 crc kubenswrapper[4690]: I0320 17:38:06.984366 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.012438 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.088470 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 17:38:07 crc 
kubenswrapper[4690]: I0320 17:38:07.089403 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.181486 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.228561 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.298548 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.444922 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.512551 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.550992 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.625205 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.655194 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.657022 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.738266 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.768332 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 17:38:07 crc kubenswrapper[4690]: I0320 17:38:07.989133 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.067193 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.136399 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.141888 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.212833 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.214380 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.231281 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 17:38:08 crc 
kubenswrapper[4690]: I0320 17:38:08.232866 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.401011 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.428813 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.429073 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.442720 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.447097 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.486911 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.514453 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.570977 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.639383 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.666132 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.738900 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.772010 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.804548 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.826683 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.849814 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.899825 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 17:38:08 crc kubenswrapper[4690]: I0320 17:38:08.963771 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.047572 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.064313 4690 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.154049 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.190622 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.373753 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.481122 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.481586 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.497125 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.508756 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.576568 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.686105 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.831912 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.835619 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 17:38:09 crc kubenswrapper[4690]: I0320 17:38:09.835667 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.011758 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.012489 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.048046 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.068920 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.090684 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.108549 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.191119 4690 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.303898 4690 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.311304 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2b2sh","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.311377 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567138-th8kh","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:38:10 crc kubenswrapper[4690]: E0320 17:38:10.311642 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a46a1ee-5f40-4d85-b726-d758b7ceff37" containerName="oauth-openshift" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.311666 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a46a1ee-5f40-4d85-b726-d758b7ceff37" containerName="oauth-openshift" Mar 20 17:38:10 crc kubenswrapper[4690]: E0320 17:38:10.311685 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" containerName="installer" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.311694 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" containerName="installer" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.311834 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="de98586b-5aaf-464b-aceb-0493a4c4a84b" containerName="installer" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.311855 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a46a1ee-5f40-4d85-b726-d758b7ceff37" containerName="oauth-openshift" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.312353 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-th8kh" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.315555 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.316019 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.316308 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.317638 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.320037 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.335976 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.335958718 podStartE2EDuration="23.335958718s" podCreationTimestamp="2026-03-20 17:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:10.334334111 +0000 UTC m=+365.200159809" watchObservedRunningTime="2026-03-20 17:38:10.335958718 +0000 UTC m=+365.201784396" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.381442 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.411044 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nftz4\" (UniqueName: \"kubernetes.io/projected/8050c5b1-a071-48e0-a371-33b5a13765cd-kube-api-access-nftz4\") pod \"auto-csr-approver-29567138-th8kh\" (UID: \"8050c5b1-a071-48e0-a371-33b5a13765cd\") " pod="openshift-infra/auto-csr-approver-29567138-th8kh" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.429082 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.501276 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.513063 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nftz4\" (UniqueName: \"kubernetes.io/projected/8050c5b1-a071-48e0-a371-33b5a13765cd-kube-api-access-nftz4\") pod \"auto-csr-approver-29567138-th8kh\" (UID: \"8050c5b1-a071-48e0-a371-33b5a13765cd\") " pod="openshift-infra/auto-csr-approver-29567138-th8kh" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.541225 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nftz4\" (UniqueName: \"kubernetes.io/projected/8050c5b1-a071-48e0-a371-33b5a13765cd-kube-api-access-nftz4\") pod \"auto-csr-approver-29567138-th8kh\" (UID: \"8050c5b1-a071-48e0-a371-33b5a13765cd\") " pod="openshift-infra/auto-csr-approver-29567138-th8kh" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.623853 4690 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.634687 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-th8kh" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.662401 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.665058 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.689530 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.707867 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.746811 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.781354 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.825532 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.834094 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.845196 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 17:38:10 crc kubenswrapper[4690]: I0320 17:38:10.984152 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.031057 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-th8kh"] Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.048778 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.087485 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.325676 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.489931 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.511375 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.534884 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.551997 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-th8kh" 
event={"ID":"8050c5b1-a071-48e0-a371-33b5a13765cd","Type":"ContainerStarted","Data":"f9a9b7091ae48c379fb72a636014dc7143060ea39c33e2d992bf44369520089c"} Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.560066 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.599762 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.632175 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.661764 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.795763 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.819716 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 17:38:11 crc kubenswrapper[4690]: I0320 17:38:11.890837 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a46a1ee-5f40-4d85-b726-d758b7ceff37" path="/var/lib/kubelet/pods/1a46a1ee-5f40-4d85-b726-d758b7ceff37/volumes" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.019523 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.023911 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.054313 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.157886 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.167714 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.293600 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.487348 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.549830 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.557395 4690 generic.go:334] "Generic (PLEG): container finished" podID="8050c5b1-a071-48e0-a371-33b5a13765cd" containerID="cfa34146f20ea5119cbff97b62210e4507312486c04e37ac7976f627f7405611" exitCode=0 Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.557452 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-th8kh" 
event={"ID":"8050c5b1-a071-48e0-a371-33b5a13765cd","Type":"ContainerDied","Data":"cfa34146f20ea5119cbff97b62210e4507312486c04e37ac7976f627f7405611"} Mar 20 17:38:12 crc kubenswrapper[4690]: I0320 17:38:12.906402 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.142720 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7cd676f696-6jgvj"] Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.143639 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.152754 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.153700 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.154051 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.154320 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.154374 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.154624 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.154890 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.155806 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.156019 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.157981 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.158795 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.163862 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.167727 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7cd676f696-6jgvj"] Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.177822 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.189548 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" 
Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.194946 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.244677 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l277d\" (UniqueName: \"kubernetes.io/projected/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-kube-api-access-l277d\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.244731 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.244761 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-session\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.244816 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-audit-policies\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.244838 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245039 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-audit-dir\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245160 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245279 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245372 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-login\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245493 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245550 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245636 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245755 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.245797 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-error\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.288213 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.343195 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 
17:38:13.346883 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.346963 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-error\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347043 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l277d\" (UniqueName: \"kubernetes.io/projected/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-kube-api-access-l277d\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347098 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347164 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-session\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347347 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-audit-policies\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347415 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347495 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-audit-dir\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347579 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347661 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347777 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-login\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347859 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347914 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.347968 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.348211 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.348700 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-audit-policies\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.349489 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.349671 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-audit-dir\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.350311 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.354106 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-error\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.354469 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.354747 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.356301 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.357093 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.357601 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.362240 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-user-template-login\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.369720 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-v4-0-config-system-session\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.381328 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l277d\" (UniqueName: \"kubernetes.io/projected/843d7ef2-52ff-47f8-81a0-5690cf3fb8cc-kube-api-access-l277d\") pod \"oauth-openshift-7cd676f696-6jgvj\" (UID: \"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc\") " pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.425984 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.480520 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.731945 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.756186 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.781559 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.808477 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.851905 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-th8kh" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.923769 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7cd676f696-6jgvj"] Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.956062 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nftz4\" (UniqueName: \"kubernetes.io/projected/8050c5b1-a071-48e0-a371-33b5a13765cd-kube-api-access-nftz4\") pod \"8050c5b1-a071-48e0-a371-33b5a13765cd\" (UID: \"8050c5b1-a071-48e0-a371-33b5a13765cd\") " Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.962446 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8050c5b1-a071-48e0-a371-33b5a13765cd-kube-api-access-nftz4" (OuterVolumeSpecName: "kube-api-access-nftz4") pod "8050c5b1-a071-48e0-a371-33b5a13765cd" (UID: "8050c5b1-a071-48e0-a371-33b5a13765cd"). InnerVolumeSpecName "kube-api-access-nftz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:13 crc kubenswrapper[4690]: I0320 17:38:13.984382 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.058217 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nftz4\" (UniqueName: \"kubernetes.io/projected/8050c5b1-a071-48e0-a371-33b5a13765cd-kube-api-access-nftz4\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.432725 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.535444 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.571418 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" event={"ID":"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc","Type":"ContainerStarted","Data":"b266a3e6c5ba8dcdbf95688d91779d5b2f179c2bdeee0fa8527d178874522079"} Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.571464 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" event={"ID":"843d7ef2-52ff-47f8-81a0-5690cf3fb8cc","Type":"ContainerStarted","Data":"82ecaba3cd0d5f3b8c6e80343336f3a2366c83bf657a1443e0c753b7a4e98b99"} Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.573131 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-th8kh" event={"ID":"8050c5b1-a071-48e0-a371-33b5a13765cd","Type":"ContainerDied","Data":"f9a9b7091ae48c379fb72a636014dc7143060ea39c33e2d992bf44369520089c"} Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.573175 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9a9b7091ae48c379fb72a636014dc7143060ea39c33e2d992bf44369520089c" Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.573206 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-th8kh" Mar 20 17:38:14 crc kubenswrapper[4690]: I0320 17:38:14.600149 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" podStartSLOduration=52.600132275 podStartE2EDuration="52.600132275s" podCreationTimestamp="2026-03-20 17:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:14.599284595 +0000 UTC m=+369.465110263" watchObservedRunningTime="2026-03-20 17:38:14.600132275 +0000 UTC m=+369.465957943" Mar 20 17:38:15 crc kubenswrapper[4690]: I0320 17:38:15.580659 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:15 crc kubenswrapper[4690]: I0320 17:38:15.585749 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7cd676f696-6jgvj" Mar 20 17:38:15 crc kubenswrapper[4690]: I0320 17:38:15.834280 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6848775948-wq8cb"] Mar 20 17:38:15 crc kubenswrapper[4690]: I0320 17:38:15.834492 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" podUID="41de6756-32f4-474f-bc58-d461313abb73" containerName="controller-manager" containerID="cri-o://7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec" gracePeriod=30 Mar 20 17:38:15 crc kubenswrapper[4690]: I0320 17:38:15.932551 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd"] Mar 20 17:38:15 crc kubenswrapper[4690]: I0320 17:38:15.932813 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" podUID="0ae207bf-1228-48f1-ab12-e941daee4948" containerName="route-controller-manager" containerID="cri-o://60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7" gracePeriod=30 Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.199337 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.288007 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.288721 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx988\" (UniqueName: \"kubernetes.io/projected/41de6756-32f4-474f-bc58-d461313abb73-kube-api-access-kx988\") pod \"41de6756-32f4-474f-bc58-d461313abb73\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.288821 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-proxy-ca-bundles\") pod \"41de6756-32f4-474f-bc58-d461313abb73\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.288873 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-client-ca\") pod \"41de6756-32f4-474f-bc58-d461313abb73\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.288894 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41de6756-32f4-474f-bc58-d461313abb73-serving-cert\") pod \"41de6756-32f4-474f-bc58-d461313abb73\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.288951 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-config\") pod \"41de6756-32f4-474f-bc58-d461313abb73\" (UID: \"41de6756-32f4-474f-bc58-d461313abb73\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.289644 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "41de6756-32f4-474f-bc58-d461313abb73" (UID: "41de6756-32f4-474f-bc58-d461313abb73"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.289720 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-config" (OuterVolumeSpecName: "config") pod "41de6756-32f4-474f-bc58-d461313abb73" (UID: "41de6756-32f4-474f-bc58-d461313abb73"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.290016 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.290040 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.290281 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-client-ca" (OuterVolumeSpecName: "client-ca") pod "41de6756-32f4-474f-bc58-d461313abb73" (UID: "41de6756-32f4-474f-bc58-d461313abb73"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.293933 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41de6756-32f4-474f-bc58-d461313abb73-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "41de6756-32f4-474f-bc58-d461313abb73" (UID: "41de6756-32f4-474f-bc58-d461313abb73"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.294045 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41de6756-32f4-474f-bc58-d461313abb73-kube-api-access-kx988" (OuterVolumeSpecName: "kube-api-access-kx988") pod "41de6756-32f4-474f-bc58-d461313abb73" (UID: "41de6756-32f4-474f-bc58-d461313abb73"). InnerVolumeSpecName "kube-api-access-kx988". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.391090 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-client-ca\") pod \"0ae207bf-1228-48f1-ab12-e941daee4948\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.391138 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rl4z\" (UniqueName: \"kubernetes.io/projected/0ae207bf-1228-48f1-ab12-e941daee4948-kube-api-access-8rl4z\") pod \"0ae207bf-1228-48f1-ab12-e941daee4948\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.391160 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae207bf-1228-48f1-ab12-e941daee4948-serving-cert\") pod \"0ae207bf-1228-48f1-ab12-e941daee4948\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.391187 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-config\") pod \"0ae207bf-1228-48f1-ab12-e941daee4948\" (UID: \"0ae207bf-1228-48f1-ab12-e941daee4948\") " Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.391519 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41de6756-32f4-474f-bc58-d461313abb73-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.391531 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41de6756-32f4-474f-bc58-d461313abb73-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.391540 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx988\" (UniqueName: \"kubernetes.io/projected/41de6756-32f4-474f-bc58-d461313abb73-kube-api-access-kx988\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.391933 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-client-ca" (OuterVolumeSpecName: "client-ca") pod "0ae207bf-1228-48f1-ab12-e941daee4948" (UID: "0ae207bf-1228-48f1-ab12-e941daee4948"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.392127 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-config" (OuterVolumeSpecName: "config") pod "0ae207bf-1228-48f1-ab12-e941daee4948" (UID: "0ae207bf-1228-48f1-ab12-e941daee4948"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.393979 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae207bf-1228-48f1-ab12-e941daee4948-kube-api-access-8rl4z" (OuterVolumeSpecName: "kube-api-access-8rl4z") pod "0ae207bf-1228-48f1-ab12-e941daee4948" (UID: "0ae207bf-1228-48f1-ab12-e941daee4948"). InnerVolumeSpecName "kube-api-access-8rl4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.394095 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae207bf-1228-48f1-ab12-e941daee4948-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0ae207bf-1228-48f1-ab12-e941daee4948" (UID: "0ae207bf-1228-48f1-ab12-e941daee4948"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.492042 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.492070 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rl4z\" (UniqueName: \"kubernetes.io/projected/0ae207bf-1228-48f1-ab12-e941daee4948-kube-api-access-8rl4z\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.492079 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae207bf-1228-48f1-ab12-e941daee4948-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.492087 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae207bf-1228-48f1-ab12-e941daee4948-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.587722 4690 generic.go:334] "Generic (PLEG): container finished" podID="41de6756-32f4-474f-bc58-d461313abb73" containerID="7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec" exitCode=0 Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.587795 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" event={"ID":"41de6756-32f4-474f-bc58-d461313abb73","Type":"ContainerDied","Data":"7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec"} Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.587794 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.587836 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6848775948-wq8cb" event={"ID":"41de6756-32f4-474f-bc58-d461313abb73","Type":"ContainerDied","Data":"5e6387354d7e6eee962a871d7beac40639c1031cc272e1915eff00db959d491f"} Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.587853 4690 scope.go:117] "RemoveContainer" containerID="7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.589163 4690 generic.go:334] "Generic (PLEG): container finished" podID="0ae207bf-1228-48f1-ab12-e941daee4948" containerID="60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7" exitCode=0 Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.589511 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.589675 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" event={"ID":"0ae207bf-1228-48f1-ab12-e941daee4948","Type":"ContainerDied","Data":"60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7"} Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.589747 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd" event={"ID":"0ae207bf-1228-48f1-ab12-e941daee4948","Type":"ContainerDied","Data":"c221e53264c7aa7c6fc8ba79f304a71d77e7a5e3c11730cdbf22157647a0c883"} Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.606210 4690 scope.go:117] "RemoveContainer" containerID="7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec" Mar 20 17:38:16 crc kubenswrapper[4690]: E0320 17:38:16.606830 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec\": container with ID starting with 7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec not found: ID does not exist" containerID="7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.606882 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec"} err="failed to get container status \"7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec\": rpc error: code = NotFound desc = could not find container \"7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec\": container with ID starting with 7d51be2d341956bbe9d859e69bb65401366990924ddb5367436269bd6325d7ec not found: ID does not exist" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.606913 4690 scope.go:117] "RemoveContainer" containerID="60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.624646 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6848775948-wq8cb"] Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.627092 4690 scope.go:117] "RemoveContainer" containerID="60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7" Mar 20 17:38:16 crc kubenswrapper[4690]: E0320 17:38:16.627664 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7\": container with ID starting with 60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7 not found: ID does not exist" containerID="60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.627749 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7"} err="failed to get container status \"60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7\": rpc error: code = NotFound desc = could not find container \"60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7\": container with ID starting 
with 60dce8b62715f585f73a49e595d756f55a4900e4723938b1ab5885c0395e03c7 not found: ID does not exist" Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.634612 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6848775948-wq8cb"] Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.639049 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd"] Mar 20 17:38:16 crc kubenswrapper[4690]: I0320 17:38:16.642926 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86fc9544-4rfsd"] Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.149906 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-qxfcj"] Mar 20 17:38:17 crc kubenswrapper[4690]: E0320 17:38:17.150701 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8050c5b1-a071-48e0-a371-33b5a13765cd" containerName="oc" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.150808 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8050c5b1-a071-48e0-a371-33b5a13765cd" containerName="oc" Mar 20 17:38:17 crc kubenswrapper[4690]: E0320 17:38:17.151604 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae207bf-1228-48f1-ab12-e941daee4948" containerName="route-controller-manager" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.151775 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae207bf-1228-48f1-ab12-e941daee4948" containerName="route-controller-manager" Mar 20 17:38:17 crc kubenswrapper[4690]: E0320 17:38:17.151915 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41de6756-32f4-474f-bc58-d461313abb73" containerName="controller-manager" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.152030 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="41de6756-32f4-474f-bc58-d461313abb73" containerName="controller-manager" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.155025 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae207bf-1228-48f1-ab12-e941daee4948" containerName="route-controller-manager" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.155201 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8050c5b1-a071-48e0-a371-33b5a13765cd" containerName="oc" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.155295 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="41de6756-32f4-474f-bc58-d461313abb73" containerName="controller-manager" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.156085 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.157124 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-v9std"] Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.158154 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.161250 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.161674 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.162409 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.162440 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.162565 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.162842 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.162938 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.163319 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.163489 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.163314 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.163893 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.166301 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-v9std"] Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.168285 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.169441 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-qxfcj"] Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.177003 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.301860 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-client-ca\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.301904 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-config\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.301964 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-client-ca\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.301997 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-proxy-ca-bundles\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.302044 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-config\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.302402 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb82e4b-643f-431a-8036-0cc6295e1adc-serving-cert\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.302439 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzskt\" (UniqueName: \"kubernetes.io/projected/1cb82e4b-643f-431a-8036-0cc6295e1adc-kube-api-access-kzskt\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.302476 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mch7k\" (UniqueName: \"kubernetes.io/projected/630eaf89-3200-4fc6-9262-e61e8f64b1ba-kube-api-access-mch7k\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.302496 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630eaf89-3200-4fc6-9262-e61e8f64b1ba-serving-cert\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.403815 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mch7k\" (UniqueName: \"kubernetes.io/projected/630eaf89-3200-4fc6-9262-e61e8f64b1ba-kube-api-access-mch7k\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.403905 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630eaf89-3200-4fc6-9262-e61e8f64b1ba-serving-cert\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.404020 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-client-ca\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.404062 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-config\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.404195 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-client-ca\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.404298 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-proxy-ca-bundles\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.404337 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-config\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.404401 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb82e4b-643f-431a-8036-0cc6295e1adc-serving-cert\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.404498 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzskt\" (UniqueName: \"kubernetes.io/projected/1cb82e4b-643f-431a-8036-0cc6295e1adc-kube-api-access-kzskt\") pod 
\"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.405307 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-client-ca\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.406085 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-config\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.408902 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-client-ca\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.410077 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-proxy-ca-bundles\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.410330 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630eaf89-3200-4fc6-9262-e61e8f64b1ba-serving-cert\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.411417 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb82e4b-643f-431a-8036-0cc6295e1adc-serving-cert\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.414018 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-config\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.423518 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mch7k\" (UniqueName: \"kubernetes.io/projected/630eaf89-3200-4fc6-9262-e61e8f64b1ba-kube-api-access-mch7k\") pod \"route-controller-manager-86498d848d-v9std\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 
17:38:17.427986 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzskt\" (UniqueName: \"kubernetes.io/projected/1cb82e4b-643f-431a-8036-0cc6295e1adc-kube-api-access-kzskt\") pod \"controller-manager-8558fbbddd-qxfcj\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.492659 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.504002 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.792291 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-qxfcj"] Mar 20 17:38:17 crc kubenswrapper[4690]: W0320 17:38:17.796933 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb82e4b_643f_431a_8036_0cc6295e1adc.slice/crio-9c5ba83a509b658eb3927928793a002d4c78effc37f3f2d34cae0e59e1646489 WatchSource:0}: Error finding container 9c5ba83a509b658eb3927928793a002d4c78effc37f3f2d34cae0e59e1646489: Status 404 returned error can't find the container with id 9c5ba83a509b658eb3927928793a002d4c78effc37f3f2d34cae0e59e1646489 Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.897475 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae207bf-1228-48f1-ab12-e941daee4948" path="/var/lib/kubelet/pods/0ae207bf-1228-48f1-ab12-e941daee4948/volumes" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.899057 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41de6756-32f4-474f-bc58-d461313abb73" path="/var/lib/kubelet/pods/41de6756-32f4-474f-bc58-d461313abb73/volumes" Mar 20 17:38:17 crc kubenswrapper[4690]: I0320 17:38:17.943075 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-v9std"] Mar 20 17:38:17 crc kubenswrapper[4690]: W0320 17:38:17.948672 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod630eaf89_3200_4fc6_9262_e61e8f64b1ba.slice/crio-5057c25669a0ae9ac9387af98c0c4d0ee8e50a1628332fe5a963a7a68364947f WatchSource:0}: Error finding container 5057c25669a0ae9ac9387af98c0c4d0ee8e50a1628332fe5a963a7a68364947f: Status 404 returned error can't find the container with id 5057c25669a0ae9ac9387af98c0c4d0ee8e50a1628332fe5a963a7a68364947f Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.604635 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" event={"ID":"630eaf89-3200-4fc6-9262-e61e8f64b1ba","Type":"ContainerStarted","Data":"6ae737b357d873cd83656c272e0e1a449709b51b3b71cc56b43a15261c6a792c"} Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.604865 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" event={"ID":"630eaf89-3200-4fc6-9262-e61e8f64b1ba","Type":"ContainerStarted","Data":"5057c25669a0ae9ac9387af98c0c4d0ee8e50a1628332fe5a963a7a68364947f"} Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.605865 
4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.606500 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" event={"ID":"1cb82e4b-643f-431a-8036-0cc6295e1adc","Type":"ContainerStarted","Data":"a160e323869f0f0da320cc6bb8ac9b02d43ed6fb0b63e0f345c2c5a32f2ea896"} Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.606536 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" event={"ID":"1cb82e4b-643f-431a-8036-0cc6295e1adc","Type":"ContainerStarted","Data":"9c5ba83a509b658eb3927928793a002d4c78effc37f3f2d34cae0e59e1646489"} Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.607129 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.613010 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.661399 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" podStartSLOduration=3.661380464 podStartE2EDuration="3.661380464s" podCreationTimestamp="2026-03-20 17:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:18.659170364 +0000 UTC m=+373.524996042" watchObservedRunningTime="2026-03-20 17:38:18.661380464 +0000 UTC m=+373.527206142" Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.663846 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" podStartSLOduration=3.663835811 podStartE2EDuration="3.663835811s" podCreationTimestamp="2026-03-20 17:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:18.631173213 +0000 UTC m=+373.496998891" watchObservedRunningTime="2026-03-20 17:38:18.663835811 +0000 UTC m=+373.529661489" Mar 20 17:38:18 crc kubenswrapper[4690]: I0320 17:38:18.757612 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:21 crc kubenswrapper[4690]: I0320 17:38:21.402058 4690 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:38:21 crc kubenswrapper[4690]: I0320 17:38:21.402631 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://0d3bb51a70f2d1c02efb6c8a28224826384cf12d0e33c9c1769ca6d92c266120" gracePeriod=5 Mar 20 17:38:24 crc kubenswrapper[4690]: I0320 17:38:24.812677 4690 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 17:38:26 crc kubenswrapper[4690]: I0320 17:38:26.943818 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:38:26 crc kubenswrapper[4690]: I0320 17:38:26.943901 4690 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="0d3bb51a70f2d1c02efb6c8a28224826384cf12d0e33c9c1769ca6d92c266120" exitCode=137 Mar 20 17:38:26 crc kubenswrapper[4690]: I0320 17:38:26.943955 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="391f1a48cd320662b7b725601192c318d3bb2be69dcddac0d4427c92fe0060bb" Mar 20 17:38:26 crc kubenswrapper[4690]: I0320 17:38:26.993814 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:38:26 crc kubenswrapper[4690]: I0320 17:38:26.994132 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.171284 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.171653 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.171683 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.171717 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.171737 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.171407 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.172042 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.172107 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.172136 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.181968 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.273502 4690 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.273533 4690 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.273542 4690 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.273552 4690 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.273562 4690 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.890680 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 17:38:27 crc kubenswrapper[4690]: I0320 17:38:27.970759 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:38:29 crc kubenswrapper[4690]: I0320 17:38:29.984619 4690 generic.go:334] "Generic (PLEG): container finished" podID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerID="b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1" exitCode=0 Mar 20 17:38:29 crc kubenswrapper[4690]: I0320 17:38:29.984717 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" event={"ID":"80d86fac-74cc-41d4-81df-2e718c1568d9","Type":"ContainerDied","Data":"b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1"} Mar 20 17:38:29 crc kubenswrapper[4690]: I0320 17:38:29.985514 4690 scope.go:117] "RemoveContainer" containerID="b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1" Mar 20 17:38:30 crc kubenswrapper[4690]: I0320 17:38:30.992568 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" event={"ID":"80d86fac-74cc-41d4-81df-2e718c1568d9","Type":"ContainerStarted","Data":"36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57"} Mar 20 17:38:30 crc kubenswrapper[4690]: I0320 17:38:30.993309 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:38:30 crc kubenswrapper[4690]: I0320 17:38:30.994539 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:38:35 crc kubenswrapper[4690]: I0320 17:38:35.835747 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-qxfcj"] Mar 20 17:38:35 crc kubenswrapper[4690]: I0320 17:38:35.836572 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" podUID="1cb82e4b-643f-431a-8036-0cc6295e1adc" containerName="controller-manager" containerID="cri-o://a160e323869f0f0da320cc6bb8ac9b02d43ed6fb0b63e0f345c2c5a32f2ea896" gracePeriod=30 Mar 20 17:38:35 crc kubenswrapper[4690]: I0320 17:38:35.847715 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-v9std"] Mar 20 17:38:35 crc kubenswrapper[4690]: I0320 17:38:35.847966 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" podUID="630eaf89-3200-4fc6-9262-e61e8f64b1ba" containerName="route-controller-manager" containerID="cri-o://6ae737b357d873cd83656c272e0e1a449709b51b3b71cc56b43a15261c6a792c" gracePeriod=30 Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.108569 4690 generic.go:334] "Generic (PLEG): container finished" podID="630eaf89-3200-4fc6-9262-e61e8f64b1ba" containerID="6ae737b357d873cd83656c272e0e1a449709b51b3b71cc56b43a15261c6a792c" exitCode=0 Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.108640 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" event={"ID":"630eaf89-3200-4fc6-9262-e61e8f64b1ba","Type":"ContainerDied","Data":"6ae737b357d873cd83656c272e0e1a449709b51b3b71cc56b43a15261c6a792c"} Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.110876 4690 generic.go:334] "Generic (PLEG): container finished" 
podID="1cb82e4b-643f-431a-8036-0cc6295e1adc" containerID="a160e323869f0f0da320cc6bb8ac9b02d43ed6fb0b63e0f345c2c5a32f2ea896" exitCode=0 Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.110914 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" event={"ID":"1cb82e4b-643f-431a-8036-0cc6295e1adc","Type":"ContainerDied","Data":"a160e323869f0f0da320cc6bb8ac9b02d43ed6fb0b63e0f345c2c5a32f2ea896"} Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.347714 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.400735 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517134 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-client-ca\") pod \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517213 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-config\") pod \"1cb82e4b-643f-431a-8036-0cc6295e1adc\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517242 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-client-ca\") pod \"1cb82e4b-643f-431a-8036-0cc6295e1adc\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517285 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mch7k\" (UniqueName: \"kubernetes.io/projected/630eaf89-3200-4fc6-9262-e61e8f64b1ba-kube-api-access-mch7k\") pod \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517310 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-proxy-ca-bundles\") pod \"1cb82e4b-643f-431a-8036-0cc6295e1adc\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517333 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630eaf89-3200-4fc6-9262-e61e8f64b1ba-serving-cert\") pod \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517410 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb82e4b-643f-431a-8036-0cc6295e1adc-serving-cert\") pod \"1cb82e4b-643f-431a-8036-0cc6295e1adc\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517449 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-config\") pod \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\" (UID: \"630eaf89-3200-4fc6-9262-e61e8f64b1ba\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.517474 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzskt\" (UniqueName: \"kubernetes.io/projected/1cb82e4b-643f-431a-8036-0cc6295e1adc-kube-api-access-kzskt\") pod \"1cb82e4b-643f-431a-8036-0cc6295e1adc\" (UID: \"1cb82e4b-643f-431a-8036-0cc6295e1adc\") " Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518001 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1cb82e4b-643f-431a-8036-0cc6295e1adc" (UID: "1cb82e4b-643f-431a-8036-0cc6295e1adc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518042 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-client-ca" (OuterVolumeSpecName: "client-ca") pod "1cb82e4b-643f-431a-8036-0cc6295e1adc" (UID: "1cb82e4b-643f-431a-8036-0cc6295e1adc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518069 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-config" (OuterVolumeSpecName: "config") pod "1cb82e4b-643f-431a-8036-0cc6295e1adc" (UID: "1cb82e4b-643f-431a-8036-0cc6295e1adc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518287 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-client-ca" (OuterVolumeSpecName: "client-ca") pod "630eaf89-3200-4fc6-9262-e61e8f64b1ba" (UID: "630eaf89-3200-4fc6-9262-e61e8f64b1ba"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518304 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-config" (OuterVolumeSpecName: "config") pod "630eaf89-3200-4fc6-9262-e61e8f64b1ba" (UID: "630eaf89-3200-4fc6-9262-e61e8f64b1ba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518637 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518654 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630eaf89-3200-4fc6-9262-e61e8f64b1ba-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518663 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518673 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.518683 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cb82e4b-643f-431a-8036-0cc6295e1adc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.521870 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630eaf89-3200-4fc6-9262-e61e8f64b1ba-kube-api-access-mch7k" (OuterVolumeSpecName: "kube-api-access-mch7k") pod "630eaf89-3200-4fc6-9262-e61e8f64b1ba" (UID: "630eaf89-3200-4fc6-9262-e61e8f64b1ba"). InnerVolumeSpecName "kube-api-access-mch7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.521941 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb82e4b-643f-431a-8036-0cc6295e1adc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1cb82e4b-643f-431a-8036-0cc6295e1adc" (UID: "1cb82e4b-643f-431a-8036-0cc6295e1adc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.521951 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630eaf89-3200-4fc6-9262-e61e8f64b1ba-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "630eaf89-3200-4fc6-9262-e61e8f64b1ba" (UID: "630eaf89-3200-4fc6-9262-e61e8f64b1ba"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.522275 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb82e4b-643f-431a-8036-0cc6295e1adc-kube-api-access-kzskt" (OuterVolumeSpecName: "kube-api-access-kzskt") pod "1cb82e4b-643f-431a-8036-0cc6295e1adc" (UID: "1cb82e4b-643f-431a-8036-0cc6295e1adc"). InnerVolumeSpecName "kube-api-access-kzskt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.619328 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mch7k\" (UniqueName: \"kubernetes.io/projected/630eaf89-3200-4fc6-9262-e61e8f64b1ba-kube-api-access-mch7k\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.619361 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630eaf89-3200-4fc6-9262-e61e8f64b1ba-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.619371 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb82e4b-643f-431a-8036-0cc6295e1adc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:36 crc kubenswrapper[4690]: I0320 17:38:36.619379 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzskt\" (UniqueName: \"kubernetes.io/projected/1cb82e4b-643f-431a-8036-0cc6295e1adc-kube-api-access-kzskt\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.123225 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" event={"ID":"630eaf89-3200-4fc6-9262-e61e8f64b1ba","Type":"ContainerDied","Data":"5057c25669a0ae9ac9387af98c0c4d0ee8e50a1628332fe5a963a7a68364947f"} Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.123273 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-v9std" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.123315 4690 scope.go:117] "RemoveContainer" containerID="6ae737b357d873cd83656c272e0e1a449709b51b3b71cc56b43a15261c6a792c" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.127763 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" event={"ID":"1cb82e4b-643f-431a-8036-0cc6295e1adc","Type":"ContainerDied","Data":"9c5ba83a509b658eb3927928793a002d4c78effc37f3f2d34cae0e59e1646489"} Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.127916 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8558fbbddd-qxfcj" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.146954 4690 scope.go:117] "RemoveContainer" containerID="a160e323869f0f0da320cc6bb8ac9b02d43ed6fb0b63e0f345c2c5a32f2ea896" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.149764 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-v9std"] Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.153112 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-v9std"] Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.174996 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-qxfcj"] Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.178234 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-qxfcj"] Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.396658 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b75cbdb4b-5q69m"] Mar 20 17:38:37 crc kubenswrapper[4690]: E0320 17:38:37.397057 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb82e4b-643f-431a-8036-0cc6295e1adc" containerName="controller-manager" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.397087 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb82e4b-643f-431a-8036-0cc6295e1adc" containerName="controller-manager" Mar 20 17:38:37 crc kubenswrapper[4690]: E0320 17:38:37.397111 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630eaf89-3200-4fc6-9262-e61e8f64b1ba" containerName="route-controller-manager" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.397123 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="630eaf89-3200-4fc6-9262-e61e8f64b1ba" containerName="route-controller-manager" Mar 20 17:38:37 crc kubenswrapper[4690]: E0320 17:38:37.397140 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.397149 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.397326 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="630eaf89-3200-4fc6-9262-e61e8f64b1ba" containerName="route-controller-manager" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.397347 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.397370 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb82e4b-643f-431a-8036-0cc6295e1adc" containerName="controller-manager" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.397852 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.401325 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.401786 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.402056 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.402157 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.402630 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.405752 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7"] Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.406614 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.408563 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.408654 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.408765 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.408808 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.408983 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.410846 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.410928 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.412585 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.413410 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b75cbdb4b-5q69m"] Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.418381 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7"] Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533114 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2c031c66-36c2-4758-a771-a38d505073c7-serving-cert\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533151 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gch5v\" (UniqueName: \"kubernetes.io/projected/2c031c66-36c2-4758-a771-a38d505073c7-kube-api-access-gch5v\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533178 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-config\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533200 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322e197a-33c6-4d46-94da-ebc1bc673cb4-serving-cert\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533246 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-client-ca\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533290 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-client-ca\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533307 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-proxy-ca-bundles\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533342 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqm2\" (UniqueName: \"kubernetes.io/projected/322e197a-33c6-4d46-94da-ebc1bc673cb4-kube-api-access-knqm2\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.533362 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-config\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.633996 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-client-ca\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.634039 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-client-ca\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.634063 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-proxy-ca-bundles\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.634113 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqm2\" (UniqueName: \"kubernetes.io/projected/322e197a-33c6-4d46-94da-ebc1bc673cb4-kube-api-access-knqm2\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.634143 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-config\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.634192 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c031c66-36c2-4758-a771-a38d505073c7-serving-cert\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.634216 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gch5v\" (UniqueName: \"kubernetes.io/projected/2c031c66-36c2-4758-a771-a38d505073c7-kube-api-access-gch5v\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.634242 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-config\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " 
pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.634297 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322e197a-33c6-4d46-94da-ebc1bc673cb4-serving-cert\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.635053 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-client-ca\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.635061 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-client-ca\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.636252 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-config\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.636291 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-proxy-ca-bundles\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.636596 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-config\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.638543 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c031c66-36c2-4758-a771-a38d505073c7-serving-cert\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.638961 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322e197a-33c6-4d46-94da-ebc1bc673cb4-serving-cert\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.662750 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gch5v\" 
(UniqueName: \"kubernetes.io/projected/2c031c66-36c2-4758-a771-a38d505073c7-kube-api-access-gch5v\") pod \"controller-manager-b75cbdb4b-5q69m\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.663305 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqm2\" (UniqueName: \"kubernetes.io/projected/322e197a-33c6-4d46-94da-ebc1bc673cb4-kube-api-access-knqm2\") pod \"route-controller-manager-898b8ffb8-96bw7\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.733663 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.739651 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.890750 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb82e4b-643f-431a-8036-0cc6295e1adc" path="/var/lib/kubelet/pods/1cb82e4b-643f-431a-8036-0cc6295e1adc/volumes" Mar 20 17:38:37 crc kubenswrapper[4690]: I0320 17:38:37.891497 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630eaf89-3200-4fc6-9262-e61e8f64b1ba" path="/var/lib/kubelet/pods/630eaf89-3200-4fc6-9262-e61e8f64b1ba/volumes" Mar 20 17:38:38 crc kubenswrapper[4690]: I0320 17:38:38.198178 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b75cbdb4b-5q69m"] Mar 20 17:38:38 crc kubenswrapper[4690]: I0320 17:38:38.245504 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7"] Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.145781 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" event={"ID":"322e197a-33c6-4d46-94da-ebc1bc673cb4","Type":"ContainerStarted","Data":"de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef"} Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.145872 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" event={"ID":"322e197a-33c6-4d46-94da-ebc1bc673cb4","Type":"ContainerStarted","Data":"752deb10ba9e7c940b130b6856fab4a5c36ade32f364a5594a58cc549609eec7"} Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.147764 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.150598 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" event={"ID":"2c031c66-36c2-4758-a771-a38d505073c7","Type":"ContainerStarted","Data":"d4301f2245c8c191fdd5f4c8f8bad6744206d4d09f15114069ab311701c934a2"} Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.150670 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" 
event={"ID":"2c031c66-36c2-4758-a771-a38d505073c7","Type":"ContainerStarted","Data":"ae1bd568b10fcfb8a3898b77b0d354efae02a2df4f16a05237d7cd0d525b5b46"} Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.151143 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.155284 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.156171 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.211241 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" podStartSLOduration=4.211218527 podStartE2EDuration="4.211218527s" podCreationTimestamp="2026-03-20 17:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:39.177632425 +0000 UTC m=+394.043458143" watchObservedRunningTime="2026-03-20 17:38:39.211218527 +0000 UTC m=+394.077044215" Mar 20 17:38:39 crc kubenswrapper[4690]: I0320 17:38:39.239905 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" podStartSLOduration=4.239885539 podStartE2EDuration="4.239885539s" podCreationTimestamp="2026-03-20 17:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:39.238334848 +0000 UTC m=+394.104160536" watchObservedRunningTime="2026-03-20 17:38:39.239885539 +0000 UTC m=+394.105711237" Mar 20 17:38:54 crc kubenswrapper[4690]: I0320 17:38:54.274062 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:38:54 crc kubenswrapper[4690]: I0320 17:38:54.274642 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:38:55 crc kubenswrapper[4690]: I0320 17:38:55.831448 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7"] Mar 20 17:38:55 crc kubenswrapper[4690]: I0320 17:38:55.831698 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" podUID="322e197a-33c6-4d46-94da-ebc1bc673cb4" containerName="route-controller-manager" containerID="cri-o://de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef" gracePeriod=30 Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.286046 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.324421 4690 generic.go:334] "Generic (PLEG): container finished" podID="322e197a-33c6-4d46-94da-ebc1bc673cb4" containerID="de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef" exitCode=0 Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.324474 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" event={"ID":"322e197a-33c6-4d46-94da-ebc1bc673cb4","Type":"ContainerDied","Data":"de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef"} Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.324490 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.324505 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7" event={"ID":"322e197a-33c6-4d46-94da-ebc1bc673cb4","Type":"ContainerDied","Data":"752deb10ba9e7c940b130b6856fab4a5c36ade32f364a5594a58cc549609eec7"} Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.324529 4690 scope.go:117] "RemoveContainer" containerID="de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.348121 4690 scope.go:117] "RemoveContainer" containerID="de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef" Mar 20 17:38:56 crc kubenswrapper[4690]: E0320 17:38:56.348656 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef\": container with ID starting with de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef not found: ID does not exist" containerID="de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.348689 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef"} err="failed to get container status \"de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef\": rpc error: code = NotFound desc = could not find container \"de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef\": container with ID starting with de90bad0024da119d2fd3283eaf97ea630088c65a3a424af23ed09d5a31e01ef not found: ID does not exist" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.431373 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-config\") pod \"322e197a-33c6-4d46-94da-ebc1bc673cb4\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.431491 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322e197a-33c6-4d46-94da-ebc1bc673cb4-serving-cert\") pod \"322e197a-33c6-4d46-94da-ebc1bc673cb4\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.431531 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-client-ca\") pod \"322e197a-33c6-4d46-94da-ebc1bc673cb4\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.431573 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knqm2\" (UniqueName: \"kubernetes.io/projected/322e197a-33c6-4d46-94da-ebc1bc673cb4-kube-api-access-knqm2\") pod \"322e197a-33c6-4d46-94da-ebc1bc673cb4\" (UID: \"322e197a-33c6-4d46-94da-ebc1bc673cb4\") " Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.432447 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-config" (OuterVolumeSpecName: "config") pod "322e197a-33c6-4d46-94da-ebc1bc673cb4" (UID: "322e197a-33c6-4d46-94da-ebc1bc673cb4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.432826 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-client-ca" (OuterVolumeSpecName: "client-ca") pod "322e197a-33c6-4d46-94da-ebc1bc673cb4" (UID: "322e197a-33c6-4d46-94da-ebc1bc673cb4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.436033 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322e197a-33c6-4d46-94da-ebc1bc673cb4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "322e197a-33c6-4d46-94da-ebc1bc673cb4" (UID: "322e197a-33c6-4d46-94da-ebc1bc673cb4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.436447 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322e197a-33c6-4d46-94da-ebc1bc673cb4-kube-api-access-knqm2" (OuterVolumeSpecName: "kube-api-access-knqm2") pod "322e197a-33c6-4d46-94da-ebc1bc673cb4" (UID: "322e197a-33c6-4d46-94da-ebc1bc673cb4"). InnerVolumeSpecName "kube-api-access-knqm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.532935 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322e197a-33c6-4d46-94da-ebc1bc673cb4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.532994 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.533020 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knqm2\" (UniqueName: \"kubernetes.io/projected/322e197a-33c6-4d46-94da-ebc1bc673cb4-kube-api-access-knqm2\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.533039 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322e197a-33c6-4d46-94da-ebc1bc673cb4-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.669038 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7"] Mar 20 17:38:56 crc kubenswrapper[4690]: I0320 17:38:56.670209 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-898b8ffb8-96bw7"] Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.413198 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph"] Mar 20 17:38:57 crc kubenswrapper[4690]: E0320 17:38:57.413824 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322e197a-33c6-4d46-94da-ebc1bc673cb4" containerName="route-controller-manager" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.413844 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="322e197a-33c6-4d46-94da-ebc1bc673cb4" containerName="route-controller-manager" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.414050 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="322e197a-33c6-4d46-94da-ebc1bc673cb4" containerName="route-controller-manager" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.414650 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.417870 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.420781 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.420811 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.420915 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.421122 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.421317 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.430607 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph"] Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.551232 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-config\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.551418 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-serving-cert\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.551473 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xwt\" (UniqueName: \"kubernetes.io/projected/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-kube-api-access-49xwt\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.551520 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-client-ca\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.653091 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-config\") pod 
\"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.653174 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-serving-cert\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.653223 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49xwt\" (UniqueName: \"kubernetes.io/projected/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-kube-api-access-49xwt\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.653300 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-client-ca\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.654906 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-client-ca\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.655550 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-config\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.660427 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-serving-cert\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.686715 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49xwt\" (UniqueName: \"kubernetes.io/projected/3bce43f6-172f-4543-8db6-3a3cdc04c5c1-kube-api-access-49xwt\") pod \"route-controller-manager-86498d848d-r76ph\" (UID: \"3bce43f6-172f-4543-8db6-3a3cdc04c5c1\") " pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.741706 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:57 crc kubenswrapper[4690]: I0320 17:38:57.891571 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="322e197a-33c6-4d46-94da-ebc1bc673cb4" path="/var/lib/kubelet/pods/322e197a-33c6-4d46-94da-ebc1bc673cb4/volumes" Mar 20 17:38:58 crc kubenswrapper[4690]: I0320 17:38:58.180957 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph"] Mar 20 17:38:58 crc kubenswrapper[4690]: I0320 17:38:58.341287 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" event={"ID":"3bce43f6-172f-4543-8db6-3a3cdc04c5c1","Type":"ContainerStarted","Data":"7c674aecfe60a96dc2b3963aeafe7f55ef093a137fce6982aac09c08f40e02b4"} Mar 20 17:38:58 crc kubenswrapper[4690]: I0320 17:38:58.341341 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" event={"ID":"3bce43f6-172f-4543-8db6-3a3cdc04c5c1","Type":"ContainerStarted","Data":"0868a6b443a1353ed1c98a8c48636c8e58403f3de61fb88dedbb4d3bc648f3c7"} Mar 20 17:38:58 crc kubenswrapper[4690]: I0320 17:38:58.341770 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:38:58 crc kubenswrapper[4690]: I0320 17:38:58.342893 4690 patch_prober.go:28] interesting pod/route-controller-manager-86498d848d-r76ph container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.71:8443/healthz\": dial tcp 10.217.0.71:8443: connect: connection refused" start-of-body= Mar 20 17:38:58 crc kubenswrapper[4690]: I0320 17:38:58.342967 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" podUID="3bce43f6-172f-4543-8db6-3a3cdc04c5c1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.71:8443/healthz\": dial tcp 10.217.0.71:8443: connect: connection refused" Mar 20 17:38:58 crc kubenswrapper[4690]: I0320 17:38:58.361300 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" podStartSLOduration=3.361280878 podStartE2EDuration="3.361280878s" podCreationTimestamp="2026-03-20 17:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:58.358743772 +0000 UTC m=+413.224569480" watchObservedRunningTime="2026-03-20 17:38:58.361280878 +0000 UTC m=+413.227106586" Mar 20 17:38:59 crc kubenswrapper[4690]: I0320 17:38:59.351515 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86498d848d-r76ph" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.031104 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4t2tn"] Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.033024 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.048054 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4t2tn"] Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.129423 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/534783ea-5143-4ebd-ac66-cb36525aa992-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.129496 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-bound-sa-token\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.129526 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjl9x\" (UniqueName: \"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-kube-api-access-kjl9x\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.129555 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.129586 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/534783ea-5143-4ebd-ac66-cb36525aa992-registry-certificates\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.129655 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/534783ea-5143-4ebd-ac66-cb36525aa992-trusted-ca\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.129868 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/534783ea-5143-4ebd-ac66-cb36525aa992-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.129970 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-registry-tls\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.159746 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.231205 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/534783ea-5143-4ebd-ac66-cb36525aa992-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.231329 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-registry-tls\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.231418 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/534783ea-5143-4ebd-ac66-cb36525aa992-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.231493 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-bound-sa-token\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.231534 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjl9x\" (UniqueName: \"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-kube-api-access-kjl9x\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.231581 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/534783ea-5143-4ebd-ac66-cb36525aa992-registry-certificates\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.231633 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/534783ea-5143-4ebd-ac66-cb36525aa992-trusted-ca\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.233408 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/534783ea-5143-4ebd-ac66-cb36525aa992-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.234184 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/534783ea-5143-4ebd-ac66-cb36525aa992-trusted-ca\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.234455 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/534783ea-5143-4ebd-ac66-cb36525aa992-registry-certificates\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.240603 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/534783ea-5143-4ebd-ac66-cb36525aa992-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.241193 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-registry-tls\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.252483 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjl9x\" (UniqueName: \"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-kube-api-access-kjl9x\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.261885 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/534783ea-5143-4ebd-ac66-cb36525aa992-bound-sa-token\") pod \"image-registry-66df7c8f76-4t2tn\" (UID: \"534783ea-5143-4ebd-ac66-cb36525aa992\") " pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.274131 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.274205 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.350218 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:24 crc kubenswrapper[4690]: I0320 17:39:24.822980 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4t2tn"] Mar 20 17:39:24 crc kubenswrapper[4690]: W0320 17:39:24.829982 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod534783ea_5143_4ebd_ac66_cb36525aa992.slice/crio-c4a1c57882cd185b267c59fc755f650dd53e21565ee786b92856a8e6c9244457 WatchSource:0}: Error finding container c4a1c57882cd185b267c59fc755f650dd53e21565ee786b92856a8e6c9244457: Status 404 returned error can't find the container with id c4a1c57882cd185b267c59fc755f650dd53e21565ee786b92856a8e6c9244457 Mar 20 17:39:25 crc kubenswrapper[4690]: I0320 17:39:25.512908 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" event={"ID":"534783ea-5143-4ebd-ac66-cb36525aa992","Type":"ContainerStarted","Data":"3f9cfa082751813c37136c1daee53f3550b0ca6fe2d42e42dbebb74db4862afc"} Mar 20 17:39:25 crc kubenswrapper[4690]: I0320 17:39:25.513243 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" event={"ID":"534783ea-5143-4ebd-ac66-cb36525aa992","Type":"ContainerStarted","Data":"c4a1c57882cd185b267c59fc755f650dd53e21565ee786b92856a8e6c9244457"} Mar 20 17:39:25 crc kubenswrapper[4690]: I0320 17:39:25.513264 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:25 crc kubenswrapper[4690]: I0320 17:39:25.540551 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" podStartSLOduration=1.540521319 podStartE2EDuration="1.540521319s" podCreationTimestamp="2026-03-20 17:39:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:25.5340757 +0000 UTC m=+440.399901418" watchObservedRunningTime="2026-03-20 17:39:25.540521319 +0000 UTC m=+440.406347037" Mar 20 17:39:35 crc kubenswrapper[4690]: I0320 17:39:35.837226 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b75cbdb4b-5q69m"] Mar 20 17:39:35 crc kubenswrapper[4690]: I0320 17:39:35.837942 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" podUID="2c031c66-36c2-4758-a771-a38d505073c7" containerName="controller-manager" containerID="cri-o://d4301f2245c8c191fdd5f4c8f8bad6744206d4d09f15114069ab311701c934a2" gracePeriod=30 Mar 20 17:39:36 crc kubenswrapper[4690]: I0320 17:39:36.581087 4690 generic.go:334] "Generic (PLEG): container finished" podID="2c031c66-36c2-4758-a771-a38d505073c7" containerID="d4301f2245c8c191fdd5f4c8f8bad6744206d4d09f15114069ab311701c934a2" exitCode=0 Mar 20 17:39:36 crc kubenswrapper[4690]: I0320 17:39:36.581145 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" 
event={"ID":"2c031c66-36c2-4758-a771-a38d505073c7","Type":"ContainerDied","Data":"d4301f2245c8c191fdd5f4c8f8bad6744206d4d09f15114069ab311701c934a2"} Mar 20 17:39:36 crc kubenswrapper[4690]: I0320 17:39:36.922357 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:39:36 crc kubenswrapper[4690]: I0320 17:39:36.964702 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-4vgmd"] Mar 20 17:39:36 crc kubenswrapper[4690]: E0320 17:39:36.965304 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c031c66-36c2-4758-a771-a38d505073c7" containerName="controller-manager" Mar 20 17:39:36 crc kubenswrapper[4690]: I0320 17:39:36.965328 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c031c66-36c2-4758-a771-a38d505073c7" containerName="controller-manager" Mar 20 17:39:36 crc kubenswrapper[4690]: I0320 17:39:36.965739 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c031c66-36c2-4758-a771-a38d505073c7" containerName="controller-manager" Mar 20 17:39:36 crc kubenswrapper[4690]: I0320 17:39:36.969896 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:36 crc kubenswrapper[4690]: I0320 17:39:36.989315 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-4vgmd"] Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.019901 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c031c66-36c2-4758-a771-a38d505073c7-serving-cert\") pod \"2c031c66-36c2-4758-a771-a38d505073c7\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020226 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-config\") pod \"2c031c66-36c2-4758-a771-a38d505073c7\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020274 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-proxy-ca-bundles\") pod \"2c031c66-36c2-4758-a771-a38d505073c7\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020300 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-client-ca\") pod \"2c031c66-36c2-4758-a771-a38d505073c7\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020374 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gch5v\" (UniqueName: \"kubernetes.io/projected/2c031c66-36c2-4758-a771-a38d505073c7-kube-api-access-gch5v\") pod \"2c031c66-36c2-4758-a771-a38d505073c7\" (UID: \"2c031c66-36c2-4758-a771-a38d505073c7\") " Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020540 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-proxy-ca-bundles\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020655 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-config\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020684 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bg8\" (UniqueName: \"kubernetes.io/projected/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-kube-api-access-97bg8\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020706 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-serving-cert\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020731 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-client-ca\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.020984 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c031c66-36c2-4758-a771-a38d505073c7" (UID: "2c031c66-36c2-4758-a771-a38d505073c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.021108 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2c031c66-36c2-4758-a771-a38d505073c7" (UID: "2c031c66-36c2-4758-a771-a38d505073c7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.022620 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-config" (OuterVolumeSpecName: "config") pod "2c031c66-36c2-4758-a771-a38d505073c7" (UID: "2c031c66-36c2-4758-a771-a38d505073c7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.025430 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c031c66-36c2-4758-a771-a38d505073c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c031c66-36c2-4758-a771-a38d505073c7" (UID: "2c031c66-36c2-4758-a771-a38d505073c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.025525 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c031c66-36c2-4758-a771-a38d505073c7-kube-api-access-gch5v" (OuterVolumeSpecName: "kube-api-access-gch5v") pod "2c031c66-36c2-4758-a771-a38d505073c7" (UID: "2c031c66-36c2-4758-a771-a38d505073c7"). InnerVolumeSpecName "kube-api-access-gch5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.123403 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-proxy-ca-bundles\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.123701 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-config\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.123790 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97bg8\" (UniqueName: \"kubernetes.io/projected/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-kube-api-access-97bg8\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.123837 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-serving-cert\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.123874 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-client-ca\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.123951 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gch5v\" (UniqueName: \"kubernetes.io/projected/2c031c66-36c2-4758-a771-a38d505073c7-kube-api-access-gch5v\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.123975 4690 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c031c66-36c2-4758-a771-a38d505073c7-serving-cert\") 
on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.123995 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.124014 4690 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.124033 4690 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c031c66-36c2-4758-a771-a38d505073c7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.125141 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-config\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.125834 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-proxy-ca-bundles\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.125927 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-client-ca\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.129155 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-serving-cert\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.148544 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bg8\" (UniqueName: \"kubernetes.io/projected/cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7-kube-api-access-97bg8\") pod \"controller-manager-8558fbbddd-4vgmd\" (UID: \"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7\") " pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.315191 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.327078 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.327138 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.327191 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.328975 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.331447 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cb690cf-caea-4c1b-ad3c-7e17a802b1a3-metrics-certs\") pod \"network-metrics-daemon-bgj72\" (UID: \"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3\") " pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.332487 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.383294 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.486565 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.494161 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bgj72" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.587998 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" event={"ID":"2c031c66-36c2-4758-a771-a38d505073c7","Type":"ContainerDied","Data":"ae1bd568b10fcfb8a3898b77b0d354efae02a2df4f16a05237d7cd0d525b5b46"} Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.588063 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b75cbdb4b-5q69m" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.588069 4690 scope.go:117] "RemoveContainer" containerID="d4301f2245c8c191fdd5f4c8f8bad6744206d4d09f15114069ab311701c934a2" Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.619445 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b75cbdb4b-5q69m"] Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.623238 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b75cbdb4b-5q69m"] Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.692322 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bgj72"] Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.811566 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8558fbbddd-4vgmd"] Mar 20 17:39:37 crc kubenswrapper[4690]: W0320 17:39:37.820085 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb7ef6f_78b3_4a42_b061_8acf2acbf4d7.slice/crio-2bf266003a39a6c5619250098103578469876cc66f5a4cfad8a7a07c362bfe67 WatchSource:0}: Error finding container 2bf266003a39a6c5619250098103578469876cc66f5a4cfad8a7a07c362bfe67: Status 404 returned error can't find the container with id 2bf266003a39a6c5619250098103578469876cc66f5a4cfad8a7a07c362bfe67 Mar 20 17:39:37 crc kubenswrapper[4690]: W0320 17:39:37.851639 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-62d4c426bf3d2b0fc454270db3f0464f7021956e07e39b06aa4b78c999b1259b WatchSource:0}: Error finding container 62d4c426bf3d2b0fc454270db3f0464f7021956e07e39b06aa4b78c999b1259b: Status 404 returned error can't find the container with id 62d4c426bf3d2b0fc454270db3f0464f7021956e07e39b06aa4b78c999b1259b Mar 20 17:39:37 crc kubenswrapper[4690]: I0320 17:39:37.891968 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c031c66-36c2-4758-a771-a38d505073c7" path="/var/lib/kubelet/pods/2c031c66-36c2-4758-a771-a38d505073c7/volumes" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.342092 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.342421 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.346784 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.347855 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.383815 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.583304 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.593892 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bgj72" event={"ID":"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3","Type":"ContainerStarted","Data":"5cd82bf27f488877c3628efcc78f0ecbe50f54e92b862b09b781cfcbddc0aeab"} Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.593939 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bgj72" event={"ID":"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3","Type":"ContainerStarted","Data":"3cade7d413c0c83813ef3ddd2f39a293ab795e2f6cbe36e4eeb3cde0159b9f65"} Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.595306 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d2e718842b6b44e68300522fe7ea5a24d5dc22589a6ed1628b0f27deaea57ca6"} Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.595349 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"62d4c426bf3d2b0fc454270db3f0464f7021956e07e39b06aa4b78c999b1259b"} Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.597188 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" event={"ID":"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7","Type":"ContainerStarted","Data":"04e921fca50d0ff006c0aa06389231096bbe7261cf10f33df9b9e3e9be841790"} Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.597220 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" event={"ID":"cfb7ef6f-78b3-4a42-b061-8acf2acbf4d7","Type":"ContainerStarted","Data":"2bf266003a39a6c5619250098103578469876cc66f5a4cfad8a7a07c362bfe67"} Mar 20 17:39:38 crc 
kubenswrapper[4690]: I0320 17:39:38.598075 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.602375 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" Mar 20 17:39:38 crc kubenswrapper[4690]: I0320 17:39:38.637405 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8558fbbddd-4vgmd" podStartSLOduration=3.6373854469999998 podStartE2EDuration="3.637385447s" podCreationTimestamp="2026-03-20 17:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:38.633196067 +0000 UTC m=+453.499021745" watchObservedRunningTime="2026-03-20 17:39:38.637385447 +0000 UTC m=+453.503211125" Mar 20 17:39:38 crc kubenswrapper[4690]: W0320 17:39:38.805450 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-a037abbc6c2f65a05a4a280f250b21e76c4e609e76569468b341de879f0612cd WatchSource:0}: Error finding container a037abbc6c2f65a05a4a280f250b21e76c4e609e76569468b341de879f0612cd: Status 404 returned error can't find the container with id a037abbc6c2f65a05a4a280f250b21e76c4e609e76569468b341de879f0612cd Mar 20 17:39:38 crc kubenswrapper[4690]: W0320 17:39:38.835033 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-5fb5d8a82fa2d36fc7228c435bbb0e1d8a2264a70ee5de863769d12da08f8116 WatchSource:0}: Error finding container 5fb5d8a82fa2d36fc7228c435bbb0e1d8a2264a70ee5de863769d12da08f8116: Status 404 returned error can't find the container with id 5fb5d8a82fa2d36fc7228c435bbb0e1d8a2264a70ee5de863769d12da08f8116 Mar 20 17:39:39 crc kubenswrapper[4690]: I0320 17:39:39.605019 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9132ec0727ba8df9820a5136efc5ead5cda5ba02452f168c19e6234eada0e52a"} Mar 20 17:39:39 crc kubenswrapper[4690]: I0320 17:39:39.605880 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a037abbc6c2f65a05a4a280f250b21e76c4e609e76569468b341de879f0612cd"} Mar 20 17:39:39 crc kubenswrapper[4690]: I0320 17:39:39.607504 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bgj72" event={"ID":"3cb690cf-caea-4c1b-ad3c-7e17a802b1a3","Type":"ContainerStarted","Data":"6400001ca2eff8ef6337c0c0d17d54c4e58b670d85581a0751bfc8b41a195d09"} Mar 20 17:39:39 crc kubenswrapper[4690]: I0320 17:39:39.611116 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f6d75e234d6a560f66579617570bcbef104e4d31efb38e1a57d842dd25974be5"} Mar 20 17:39:39 crc kubenswrapper[4690]: I0320 17:39:39.611196 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5fb5d8a82fa2d36fc7228c435bbb0e1d8a2264a70ee5de863769d12da08f8116"} Mar 20 17:39:39 crc kubenswrapper[4690]: I0320 17:39:39.611484 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:39:39 crc kubenswrapper[4690]: I0320 17:39:39.665202 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bgj72" podStartSLOduration=411.665176577 podStartE2EDuration="6m51.665176577s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:39.659625321 +0000 UTC m=+454.525451059" watchObservedRunningTime="2026-03-20 17:39:39.665176577 +0000 UTC m=+454.531002265" Mar 20 17:39:44 crc kubenswrapper[4690]: I0320 17:39:44.355768 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-4t2tn" Mar 20 17:39:44 crc kubenswrapper[4690]: I0320 17:39:44.415897 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fhf7"] Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.274839 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.275682 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.275755 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.276589 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"810ef61dfd66653c97e50a7c5e658e3e4610648ff84dc8342c8cadb5532980bc"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.276672 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://810ef61dfd66653c97e50a7c5e658e3e4610648ff84dc8342c8cadb5532980bc" gracePeriod=600 Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.745184 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="810ef61dfd66653c97e50a7c5e658e3e4610648ff84dc8342c8cadb5532980bc" exitCode=0 Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.745405 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"810ef61dfd66653c97e50a7c5e658e3e4610648ff84dc8342c8cadb5532980bc"} Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.745747 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"a6ff45a480211c2f0e008e8cd259faa67c07e468ebd44fd74d331920aaa63b33"} Mar 20 17:39:54 crc kubenswrapper[4690]: I0320 17:39:54.745788 4690 scope.go:117] "RemoveContainer" containerID="09565d72b6e11bc9bc4f72446c455016fb107bdf0fe367b56427ce9f79c20b0e" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.144133 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567140-xh2x5"] Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.146911 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-xh2x5" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.149557 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.150004 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.150000 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-xh2x5"] Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.161770 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.223353 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmngx\" (UniqueName: \"kubernetes.io/projected/be701514-9b72-4af7-8a67-bbf545296477-kube-api-access-zmngx\") pod \"auto-csr-approver-29567140-xh2x5\" (UID: \"be701514-9b72-4af7-8a67-bbf545296477\") " pod="openshift-infra/auto-csr-approver-29567140-xh2x5" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.324794 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmngx\" (UniqueName: \"kubernetes.io/projected/be701514-9b72-4af7-8a67-bbf545296477-kube-api-access-zmngx\") pod \"auto-csr-approver-29567140-xh2x5\" (UID: \"be701514-9b72-4af7-8a67-bbf545296477\") " pod="openshift-infra/auto-csr-approver-29567140-xh2x5" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.357919 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmngx\" (UniqueName: \"kubernetes.io/projected/be701514-9b72-4af7-8a67-bbf545296477-kube-api-access-zmngx\") pod \"auto-csr-approver-29567140-xh2x5\" (UID: \"be701514-9b72-4af7-8a67-bbf545296477\") " pod="openshift-infra/auto-csr-approver-29567140-xh2x5" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.475674 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-xh2x5" Mar 20 17:40:00 crc kubenswrapper[4690]: I0320 17:40:00.864818 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-xh2x5"] Mar 20 17:40:00 crc kubenswrapper[4690]: W0320 17:40:00.874526 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe701514_9b72_4af7_8a67_bbf545296477.slice/crio-13d54c97ffd1b29398ed9239518bafa327b044767f92db6142a26e1b360fdd6a WatchSource:0}: Error finding container 13d54c97ffd1b29398ed9239518bafa327b044767f92db6142a26e1b360fdd6a: Status 404 returned error can't find the container with id 13d54c97ffd1b29398ed9239518bafa327b044767f92db6142a26e1b360fdd6a Mar 20 17:40:01 crc kubenswrapper[4690]: I0320 17:40:01.802649 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-xh2x5" event={"ID":"be701514-9b72-4af7-8a67-bbf545296477","Type":"ContainerStarted","Data":"13d54c97ffd1b29398ed9239518bafa327b044767f92db6142a26e1b360fdd6a"} Mar 20 17:40:03 crc kubenswrapper[4690]: I0320 17:40:03.826203 4690 generic.go:334] "Generic (PLEG): container finished" podID="be701514-9b72-4af7-8a67-bbf545296477" containerID="40addebe99a4361632e57afb962d6551a5565f475c212055c71d2e0db97b2bce" exitCode=0 Mar 20 17:40:03 crc kubenswrapper[4690]: I0320 17:40:03.826348 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-xh2x5" event={"ID":"be701514-9b72-4af7-8a67-bbf545296477","Type":"ContainerDied","Data":"40addebe99a4361632e57afb962d6551a5565f475c212055c71d2e0db97b2bce"} Mar 20 17:40:05 crc kubenswrapper[4690]: I0320 17:40:05.286695 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-xh2x5" Mar 20 17:40:05 crc kubenswrapper[4690]: I0320 17:40:05.387564 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmngx\" (UniqueName: \"kubernetes.io/projected/be701514-9b72-4af7-8a67-bbf545296477-kube-api-access-zmngx\") pod \"be701514-9b72-4af7-8a67-bbf545296477\" (UID: \"be701514-9b72-4af7-8a67-bbf545296477\") " Mar 20 17:40:05 crc kubenswrapper[4690]: I0320 17:40:05.393522 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be701514-9b72-4af7-8a67-bbf545296477-kube-api-access-zmngx" (OuterVolumeSpecName: "kube-api-access-zmngx") pod "be701514-9b72-4af7-8a67-bbf545296477" (UID: "be701514-9b72-4af7-8a67-bbf545296477"). InnerVolumeSpecName "kube-api-access-zmngx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:05 crc kubenswrapper[4690]: I0320 17:40:05.488755 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmngx\" (UniqueName: \"kubernetes.io/projected/be701514-9b72-4af7-8a67-bbf545296477-kube-api-access-zmngx\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:05 crc kubenswrapper[4690]: I0320 17:40:05.840014 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-xh2x5" event={"ID":"be701514-9b72-4af7-8a67-bbf545296477","Type":"ContainerDied","Data":"13d54c97ffd1b29398ed9239518bafa327b044767f92db6142a26e1b360fdd6a"} Mar 20 17:40:05 crc kubenswrapper[4690]: I0320 17:40:05.840290 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d54c97ffd1b29398ed9239518bafa327b044767f92db6142a26e1b360fdd6a" Mar 20 17:40:05 crc kubenswrapper[4690]: I0320 17:40:05.840067 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-xh2x5" Mar 20 17:40:06 crc kubenswrapper[4690]: I0320 17:40:06.370710 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-66l98"] Mar 20 17:40:06 crc kubenswrapper[4690]: I0320 17:40:06.375552 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-66l98"] Mar 20 17:40:07 crc kubenswrapper[4690]: I0320 17:40:07.895143 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d2f5b9-1f8e-4413-b178-58cd10fa7548" path="/var/lib/kubelet/pods/34d2f5b9-1f8e-4413-b178-58cd10fa7548/volumes" Mar 20 17:40:09 crc kubenswrapper[4690]: I0320 17:40:09.454709 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" podUID="11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" containerName="registry" containerID="cri-o://4adc951754cfda921010f0fa0d9abfc0c746e7568c061110a54ad12757acf5eb" gracePeriod=30 Mar 20 17:40:09 crc kubenswrapper[4690]: I0320 17:40:09.872236 4690 generic.go:334] "Generic (PLEG): container finished" podID="11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" containerID="4adc951754cfda921010f0fa0d9abfc0c746e7568c061110a54ad12757acf5eb" exitCode=0 Mar 20 17:40:09 crc kubenswrapper[4690]: I0320 17:40:09.872305 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" event={"ID":"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8","Type":"ContainerDied","Data":"4adc951754cfda921010f0fa0d9abfc0c746e7568c061110a54ad12757acf5eb"} Mar 20 17:40:09 crc kubenswrapper[4690]: I0320 17:40:09.872618 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" event={"ID":"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8","Type":"ContainerDied","Data":"656e2bbbac3fdf4d70614ec5676403b5a6fbb7ca5c8bba31f472a2bfcf23e8f4"} Mar 20 17:40:09 crc kubenswrapper[4690]: I0320 17:40:09.872647 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="656e2bbbac3fdf4d70614ec5676403b5a6fbb7ca5c8bba31f472a2bfcf23e8f4" Mar 20 17:40:09 crc kubenswrapper[4690]: I0320 17:40:09.907723 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.049631 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.049678 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz9rp\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-kube-api-access-cz9rp\") pod \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.049704 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-tls\") pod \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.049740 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-bound-sa-token\") pod \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.049764 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-ca-trust-extracted\") pod \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.049791 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-certificates\") pod \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.049820 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-trusted-ca\") pod \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.049864 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-installation-pull-secrets\") pod \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\" (UID: \"11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8\") " Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.050743 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.051562 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.055879 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-kube-api-access-cz9rp" (OuterVolumeSpecName: "kube-api-access-cz9rp") pod "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8"). InnerVolumeSpecName "kube-api-access-cz9rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.056119 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.056354 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.056880 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.068046 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.086166 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" (UID: "11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.151172 4690 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.151229 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.151250 4690 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.151294 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz9rp\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-kube-api-access-cz9rp\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.151311 4690 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.151327 4690 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.151344 4690 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.881650 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6fhf7" Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.919384 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fhf7"] Mar 20 17:40:10 crc kubenswrapper[4690]: I0320 17:40:10.924929 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6fhf7"] Mar 20 17:40:11 crc kubenswrapper[4690]: I0320 17:40:11.893462 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" path="/var/lib/kubelet/pods/11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8/volumes" Mar 20 17:40:18 crc kubenswrapper[4690]: I0320 17:40:18.587912 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.375042 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwgk4"] Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.375741 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vwgk4" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerName="registry-server" containerID="cri-o://1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b" gracePeriod=30 Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.386837 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4m7xw"] Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.387115 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4m7xw" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerName="registry-server" containerID="cri-o://a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657" gracePeriod=30 Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.396752 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dnpcn"] Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.396958 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" containerID="cri-o://36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57" gracePeriod=30 Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.411602 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssfrq"] Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.411892 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ssfrq" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="registry-server" containerID="cri-o://d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c" gracePeriod=30 Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.427117 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zltxc"] Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.427615 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zltxc" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="registry-server" 
containerID="cri-o://09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716" gracePeriod=30 Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.430356 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hpm8"] Mar 20 17:40:25 crc kubenswrapper[4690]: E0320 17:40:25.430600 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be701514-9b72-4af7-8a67-bbf545296477" containerName="oc" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.430619 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="be701514-9b72-4af7-8a67-bbf545296477" containerName="oc" Mar 20 17:40:25 crc kubenswrapper[4690]: E0320 17:40:25.430629 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" containerName="registry" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.430635 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" containerName="registry" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.430762 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="11dbcb55-a8fa-4a9b-a8a3-bf6e79fa18c8" containerName="registry" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.430774 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="be701514-9b72-4af7-8a67-bbf545296477" containerName="oc" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.431155 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.439571 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hpm8"] Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.460180 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz5m7\" (UniqueName: \"kubernetes.io/projected/23f72eed-c3c0-4aed-a4a8-8243c27a2785-kube-api-access-dz5m7\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.460234 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23f72eed-c3c0-4aed-a4a8-8243c27a2785-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.460349 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23f72eed-c3c0-4aed-a4a8-8243c27a2785-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.561112 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz5m7\" (UniqueName: \"kubernetes.io/projected/23f72eed-c3c0-4aed-a4a8-8243c27a2785-kube-api-access-dz5m7\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.561169 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23f72eed-c3c0-4aed-a4a8-8243c27a2785-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.561210 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23f72eed-c3c0-4aed-a4a8-8243c27a2785-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.562572 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23f72eed-c3c0-4aed-a4a8-8243c27a2785-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.581320 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz5m7\" (UniqueName: \"kubernetes.io/projected/23f72eed-c3c0-4aed-a4a8-8243c27a2785-kube-api-access-dz5m7\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.584462 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23f72eed-c3c0-4aed-a4a8-8243c27a2785-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7hpm8\" (UID: \"23f72eed-c3c0-4aed-a4a8-8243c27a2785\") " pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.814347 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.821041 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.864903 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.890407 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.921400 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.928684 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.975911 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-utilities\") pod \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.975973 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87j8k\" (UniqueName: \"kubernetes.io/projected/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-kube-api-access-87j8k\") pod \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.975994 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-utilities\") pod \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.976011 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jpch\" (UniqueName: \"kubernetes.io/projected/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-kube-api-access-5jpch\") pod \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.976065 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-catalog-content\") pod \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\" (UID: \"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18\") " Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.976086 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-catalog-content\") pod \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\" (UID: \"edacb8ae-57ae-41f3-b13b-a423afa0e2dd\") " Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.976983 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-utilities" (OuterVolumeSpecName: "utilities") pod "edacb8ae-57ae-41f3-b13b-a423afa0e2dd" (UID: "edacb8ae-57ae-41f3-b13b-a423afa0e2dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.978164 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-utilities" (OuterVolumeSpecName: "utilities") pod "30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" (UID: "30d0d78a-3ea1-4ce6-b8fb-13645cfedf18"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.978985 4690 generic.go:334] "Generic (PLEG): container finished" podID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerID="09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716" exitCode=0 Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.979052 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zltxc" event={"ID":"edacb8ae-57ae-41f3-b13b-a423afa0e2dd","Type":"ContainerDied","Data":"09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716"} Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.979077 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zltxc" event={"ID":"edacb8ae-57ae-41f3-b13b-a423afa0e2dd","Type":"ContainerDied","Data":"a56a35a3fc4901935cd2bf973dcec03fe7d2b7eba6651944115022317aa5c473"} Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.979093 4690 scope.go:117] "RemoveContainer" containerID="09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.979201 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zltxc" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.980636 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-kube-api-access-5jpch" (OuterVolumeSpecName: "kube-api-access-5jpch") pod "30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" (UID: "30d0d78a-3ea1-4ce6-b8fb-13645cfedf18"). InnerVolumeSpecName "kube-api-access-5jpch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.982307 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-kube-api-access-87j8k" (OuterVolumeSpecName: "kube-api-access-87j8k") pod "edacb8ae-57ae-41f3-b13b-a423afa0e2dd" (UID: "edacb8ae-57ae-41f3-b13b-a423afa0e2dd"). InnerVolumeSpecName "kube-api-access-87j8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.991757 4690 generic.go:334] "Generic (PLEG): container finished" podID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerID="d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c" exitCode=0 Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.991817 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssfrq" event={"ID":"244c63f8-c484-4edb-9cb6-0ac6a9dac136","Type":"ContainerDied","Data":"d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c"} Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.991849 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssfrq" event={"ID":"244c63f8-c484-4edb-9cb6-0ac6a9dac136","Type":"ContainerDied","Data":"ea5166e7133ca5ed7a8463d26ce78afda8c19a95a9619bfeb6567454c9547370"} Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.991914 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssfrq" Mar 20 17:40:25 crc kubenswrapper[4690]: I0320 17:40:25.996019 4690 scope.go:117] "RemoveContainer" containerID="84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.000860 4690 generic.go:334] "Generic (PLEG): container finished" podID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerID="1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b" exitCode=0 Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.000984 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwgk4" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.001233 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwgk4" event={"ID":"416d626a-ef44-4b4e-91ce-51042b01a45a","Type":"ContainerDied","Data":"1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b"} Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.001280 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwgk4" event={"ID":"416d626a-ef44-4b4e-91ce-51042b01a45a","Type":"ContainerDied","Data":"e8f670484c751ffd572d834f9d49b8b85a59e9f0b8533556624448b2653f7d87"} Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.013585 4690 generic.go:334] "Generic (PLEG): container finished" podID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerID="a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657" exitCode=0 Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.013671 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7xw" event={"ID":"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18","Type":"ContainerDied","Data":"a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657"} Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.013698 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7xw" event={"ID":"30d0d78a-3ea1-4ce6-b8fb-13645cfedf18","Type":"ContainerDied","Data":"354d0cde1963d6ff64bbe5e6e497b0642e6f4bdfe3ab86580d7ec32d37b65735"} Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.014217 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m7xw" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.019816 4690 generic.go:334] "Generic (PLEG): container finished" podID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerID="36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57" exitCode=0 Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.019964 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" event={"ID":"80d86fac-74cc-41d4-81df-2e718c1568d9","Type":"ContainerDied","Data":"36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57"} Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.019992 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" event={"ID":"80d86fac-74cc-41d4-81df-2e718c1568d9","Type":"ContainerDied","Data":"cf6df4e6f7fe75b0a336e3b90e47481446533d1991a1e2916fbe7c0f4b5977b2"} Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.020042 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dnpcn" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.026678 4690 scope.go:117] "RemoveContainer" containerID="1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.029083 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" (UID: "30d0d78a-3ea1-4ce6-b8fb-13645cfedf18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.045919 4690 scope.go:117] "RemoveContainer" containerID="09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.046246 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716\": container with ID starting with 09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716 not found: ID does not exist" containerID="09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.046301 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716"} err="failed to get container status \"09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716\": rpc error: code = NotFound desc = could not find container \"09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716\": container with ID starting with 09b034838f6bb8b6db8b82a642484201233acf32ac664e09394e60737b22b716 not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.046328 4690 scope.go:117] "RemoveContainer" containerID="84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.046796 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a\": container with ID starting with 84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a not found: ID does not exist" containerID="84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.046818 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a"} err="failed to get container status \"84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a\": rpc error: code = NotFound desc = could not find container \"84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a\": container with ID starting with 84c094381485fb9029decdd3f4ffdb718e527e641f2e5b3bff237eed3c96ac6a not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.046831 4690 scope.go:117] "RemoveContainer" containerID="1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.047133 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503\": container with ID starting with 1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503 not found: ID does not exist" containerID="1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.047164 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503"} err="failed to get container status \"1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503\": rpc error: code = NotFound desc = could not find container \"1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503\": container with ID starting with 1db2ba672516ba3a1e846e0216de4fa0af7bbd987928438b3a0ce4db94a58503 not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.047184 4690 scope.go:117] "RemoveContainer" containerID="d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.058443 4690 scope.go:117] "RemoveContainer" containerID="bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.075327 4690 scope.go:117] "RemoveContainer" containerID="f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.078517 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-catalog-content\") pod \"416d626a-ef44-4b4e-91ce-51042b01a45a\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.078574 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phklv\" (UniqueName: \"kubernetes.io/projected/244c63f8-c484-4edb-9cb6-0ac6a9dac136-kube-api-access-phklv\") pod \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.078600 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-operator-metrics\") pod \"80d86fac-74cc-41d4-81df-2e718c1568d9\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.078648 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-utilities\") pod \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\" (UID: \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.079109 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57922\" (UniqueName: \"kubernetes.io/projected/80d86fac-74cc-41d4-81df-2e718c1568d9-kube-api-access-57922\") pod \"80d86fac-74cc-41d4-81df-2e718c1568d9\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.079134 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-catalog-content\") pod \"244c63f8-c484-4edb-9cb6-0ac6a9dac136\" (UID: 
\"244c63f8-c484-4edb-9cb6-0ac6a9dac136\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.079173 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-trusted-ca\") pod \"80d86fac-74cc-41d4-81df-2e718c1568d9\" (UID: \"80d86fac-74cc-41d4-81df-2e718c1568d9\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.079204 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtsrr\" (UniqueName: \"kubernetes.io/projected/416d626a-ef44-4b4e-91ce-51042b01a45a-kube-api-access-jtsrr\") pod \"416d626a-ef44-4b4e-91ce-51042b01a45a\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.079234 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-utilities\") pod \"416d626a-ef44-4b4e-91ce-51042b01a45a\" (UID: \"416d626a-ef44-4b4e-91ce-51042b01a45a\") " Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.080082 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-utilities" (OuterVolumeSpecName: "utilities") pod "416d626a-ef44-4b4e-91ce-51042b01a45a" (UID: "416d626a-ef44-4b4e-91ce-51042b01a45a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.080221 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-utilities" (OuterVolumeSpecName: "utilities") pod "244c63f8-c484-4edb-9cb6-0ac6a9dac136" (UID: "244c63f8-c484-4edb-9cb6-0ac6a9dac136"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.080416 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "80d86fac-74cc-41d4-81df-2e718c1568d9" (UID: "80d86fac-74cc-41d4-81df-2e718c1568d9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.082115 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "80d86fac-74cc-41d4-81df-2e718c1568d9" (UID: "80d86fac-74cc-41d4-81df-2e718c1568d9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.082339 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d86fac-74cc-41d4-81df-2e718c1568d9-kube-api-access-57922" (OuterVolumeSpecName: "kube-api-access-57922") pod "80d86fac-74cc-41d4-81df-2e718c1568d9" (UID: "80d86fac-74cc-41d4-81df-2e718c1568d9"). InnerVolumeSpecName "kube-api-access-57922". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.082409 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416d626a-ef44-4b4e-91ce-51042b01a45a-kube-api-access-jtsrr" (OuterVolumeSpecName: "kube-api-access-jtsrr") pod "416d626a-ef44-4b4e-91ce-51042b01a45a" (UID: "416d626a-ef44-4b4e-91ce-51042b01a45a"). InnerVolumeSpecName "kube-api-access-jtsrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.083139 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244c63f8-c484-4edb-9cb6-0ac6a9dac136-kube-api-access-phklv" (OuterVolumeSpecName: "kube-api-access-phklv") pod "244c63f8-c484-4edb-9cb6-0ac6a9dac136" (UID: "244c63f8-c484-4edb-9cb6-0ac6a9dac136"). InnerVolumeSpecName "kube-api-access-phklv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.084228 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87j8k\" (UniqueName: \"kubernetes.io/projected/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-kube-api-access-87j8k\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.084320 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.084338 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jpch\" (UniqueName: \"kubernetes.io/projected/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-kube-api-access-5jpch\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.084349 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.084359 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.090315 4690 scope.go:117] "RemoveContainer" containerID="d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.091506 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c\": container with ID starting with d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c not found: ID does not exist" containerID="d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.092035 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c"} err="failed to get container status \"d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c\": rpc error: code = NotFound desc = could not find container \"d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c\": container with ID starting with d7de445ebe7990df16abb70ee2900f98d44fdfb8df9b82007bc8ece8b464694c not found: ID 
does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.092320 4690 scope.go:117] "RemoveContainer" containerID="bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.093879 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913\": container with ID starting with bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913 not found: ID does not exist" containerID="bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.093922 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913"} err="failed to get container status \"bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913\": rpc error: code = NotFound desc = could not find container \"bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913\": container with ID starting with bad2421123885acf87544a625c009c38d815d1ae18f099a7195e0ae1f3e2d913 not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.093949 4690 scope.go:117] "RemoveContainer" containerID="f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.094668 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde\": container with ID starting with f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde not found: ID does not exist" containerID="f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.094715 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde"} err="failed to get container status \"f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde\": rpc error: code = NotFound desc = could not find container \"f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde\": container with ID starting with f2e746f01c034ad6a813a3d33d439ba2886dbd797f83e2a72ce1203e983cbcde not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.094750 4690 scope.go:117] "RemoveContainer" containerID="1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.101540 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edacb8ae-57ae-41f3-b13b-a423afa0e2dd" (UID: "edacb8ae-57ae-41f3-b13b-a423afa0e2dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.119356 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "244c63f8-c484-4edb-9cb6-0ac6a9dac136" (UID: "244c63f8-c484-4edb-9cb6-0ac6a9dac136"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.122515 4690 scope.go:117] "RemoveContainer" containerID="0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.144060 4690 scope.go:117] "RemoveContainer" containerID="798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.157266 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "416d626a-ef44-4b4e-91ce-51042b01a45a" (UID: "416d626a-ef44-4b4e-91ce-51042b01a45a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.164428 4690 scope.go:117] "RemoveContainer" containerID="1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.164758 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b\": container with ID starting with 1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b not found: ID does not exist" containerID="1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.164801 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b"} err="failed to get container status \"1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b\": rpc error: code = NotFound desc = could not find container \"1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b\": container with ID starting with 1fe3dcdb3969906ffca8b3854da57c53585e6e2e9ca61660385b5987ad74672b not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.164848 4690 scope.go:117] "RemoveContainer" containerID="0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.165168 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf\": container with ID starting with 0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf not found: ID does not exist" containerID="0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.165197 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf"} err="failed to get container status \"0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf\": rpc error: code = NotFound desc = could not find container \"0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf\": container with ID starting with 0bec18c10ccf11a63ee46b90dc5f20be0e93558e8ef490df6229d21da1a612bf not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.165219 4690 scope.go:117] "RemoveContainer" containerID="798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353" Mar 20 17:40:26 crc kubenswrapper[4690]: 
E0320 17:40:26.165533 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353\": container with ID starting with 798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353 not found: ID does not exist" containerID="798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.165567 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353"} err="failed to get container status \"798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353\": rpc error: code = NotFound desc = could not find container \"798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353\": container with ID starting with 798662e81e66e461de94f26e9ad33bd80b165e58ad4a15aa5b734cdc24628353 not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.165607 4690 scope.go:117] "RemoveContainer" containerID="a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.179294 4690 scope.go:117] "RemoveContainer" containerID="b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192684 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192710 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57922\" (UniqueName: \"kubernetes.io/projected/80d86fac-74cc-41d4-81df-2e718c1568d9-kube-api-access-57922\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192723 4690 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192731 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtsrr\" (UniqueName: \"kubernetes.io/projected/416d626a-ef44-4b4e-91ce-51042b01a45a-kube-api-access-jtsrr\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192742 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edacb8ae-57ae-41f3-b13b-a423afa0e2dd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192750 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192759 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416d626a-ef44-4b4e-91ce-51042b01a45a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192767 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phklv\" (UniqueName: \"kubernetes.io/projected/244c63f8-c484-4edb-9cb6-0ac6a9dac136-kube-api-access-phklv\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192777 4690 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80d86fac-74cc-41d4-81df-2e718c1568d9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.192785 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/244c63f8-c484-4edb-9cb6-0ac6a9dac136-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.201987 4690 scope.go:117] "RemoveContainer" containerID="0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.212132 4690 scope.go:117] "RemoveContainer" containerID="a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.212710 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657\": container with ID starting with a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657 not found: ID does not exist" containerID="a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.212743 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657"} err="failed to get container status \"a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657\": rpc error: code = NotFound desc = could not find container \"a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657\": container with ID starting with a1c7f63012a04e35bc876e57c218890507ef1e6d645db5b916abf9ec63e0c657 not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.212769 4690 scope.go:117] "RemoveContainer" containerID="b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.213099 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f\": container with ID starting with b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f not found: ID does not exist" containerID="b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.213138 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f"} err="failed to get container status \"b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f\": rpc error: code = NotFound desc = could not find container \"b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f\": container with ID starting with b2b52611441a520ab3625dab25e0048ab70ec8325a6a02bbe734c01f5f5c7f9f not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.213171 4690 scope.go:117] "RemoveContainer" containerID="0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.216402 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b\": container with ID starting with 0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b not found: ID does not exist" containerID="0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.216441 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b"} err="failed to get container status \"0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b\": rpc error: code = NotFound desc = could not find container \"0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b\": container with ID starting with 0ddcfbf9f5cd054792d532a739efea5d4020042e55919f7bef16c4c048b1328b not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.216463 4690 scope.go:117] "RemoveContainer" containerID="36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.233368 4690 scope.go:117] "RemoveContainer" containerID="b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.239882 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7hpm8"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.252607 4690 scope.go:117] "RemoveContainer" containerID="36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.252958 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57\": container with ID starting with 36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57 not found: ID does not exist" containerID="36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.253018 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57"} err="failed to get container status \"36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57\": rpc error: code = NotFound desc = could not find container \"36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57\": container with ID starting with 36dc46d3ac7a19e5b7a5729f297ee7763ce7e53a8aa3f958c84483bd1e69de57 not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.253045 4690 scope.go:117] "RemoveContainer" containerID="b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1" Mar 20 17:40:26 crc kubenswrapper[4690]: E0320 17:40:26.253349 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1\": container with ID starting with b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1 not found: ID does not exist" containerID="b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.253376 4690 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1"} err="failed to get container status \"b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1\": rpc error: code = NotFound desc = could not find container \"b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1\": container with ID starting with b760ad6cf95133d8fc74387d30f58aa3b60fa64983a86b7f2b2cf8c0828be7a1 not found: ID does not exist" Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.322620 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zltxc"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.329150 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zltxc"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.341801 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssfrq"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.343762 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssfrq"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.354852 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwgk4"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.357620 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vwgk4"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.366603 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dnpcn"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.371287 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dnpcn"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.380148 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4m7xw"] Mar 20 17:40:26 crc kubenswrapper[4690]: I0320 17:40:26.383716 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4m7xw"] Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.037953 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" event={"ID":"23f72eed-c3c0-4aed-a4a8-8243c27a2785","Type":"ContainerStarted","Data":"1904f503005eec0e9437fcfb25880beaecd7054bdd841d97bbe3eb81ad82d6dd"} Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.038356 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.038378 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" event={"ID":"23f72eed-c3c0-4aed-a4a8-8243c27a2785","Type":"ContainerStarted","Data":"647ca70d44b4359e01d77147636950db51b467540faa76843e6f8281c645835e"} Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.045755 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.061478 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7hpm8" podStartSLOduration=2.061459589 podStartE2EDuration="2.061459589s" 
podCreationTimestamp="2026-03-20 17:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:27.059560526 +0000 UTC m=+501.925386214" watchObservedRunningTime="2026-03-20 17:40:27.061459589 +0000 UTC m=+501.927285267" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595454 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xxql6"] Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595658 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595669 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595679 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerName="extract-utilities" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595685 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerName="extract-utilities" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595696 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="extract-content" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595701 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="extract-content" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595710 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerName="extract-content" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595715 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerName="extract-content" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595723 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595728 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595736 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerName="extract-utilities" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595743 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerName="extract-utilities" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595809 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595819 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595828 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="extract-utilities" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595861 4690 
state_mem.go:107] "Deleted CPUSet assignment" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="extract-utilities" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595870 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595890 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595899 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="extract-utilities" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595904 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="extract-utilities" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595911 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595926 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595935 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595942 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595952 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerName="extract-content" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595958 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerName="extract-content" Mar 20 17:40:27 crc kubenswrapper[4690]: E0320 17:40:27.595970 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="extract-content" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.595976 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="extract-content" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.596105 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.596118 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.596127 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.596143 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" containerName="registry-server" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.596176 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" containerName="registry-server" 
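The cpu_manager/memory_manager entries just above show the kubelet sweeping stale per-container resource state left behind by the pods deleted moments earlier, before it admits the new redhat-marketplace-xxql6 pod. A minimal, hypothetical Go sketch of that pattern follows (this is not kubelet source code; the pod UIDs are reused from the log lines above, and the CPU-set values are invented for illustration):

    package main

    import "fmt"

    type containerKey struct {
        podUID        string
        containerName string
    }

    // removeStaleState drops every assignment whose owning pod is no longer
    // active; each deletion corresponds to one "RemoveStaleState: removing
    // container" / "Deleted CPUSet assignment" pair in the log above.
    func removeStaleState(assignments map[containerKey]string, activePods map[string]bool) {
        for k := range assignments {
            if !activePods[k.podUID] {
                fmt.Printf("removing stale assignment pod=%s container=%s\n", k.podUID, k.containerName)
                delete(assignments, k)
            }
        }
    }

    func main() {
        // UIDs taken from the surrounding log; CPU sets are placeholders.
        assignments := map[containerKey]string{
            {podUID: "244c63f8-c484-4edb-9cb6-0ac6a9dac136", containerName: "registry-server"}:      "cpus 2-3",
            {podUID: "80d86fac-74cc-41d4-81df-2e718c1568d9", containerName: "marketplace-operator"}: "cpus 0-1",
        }
        active := map[string]bool{"255ea7b7-2364-4ebf-9104-6a78278ee9c0": true} // the newly admitted pod
        removeStaleState(assignments, active)
        fmt.Println("remaining assignments:", len(assignments))
    }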
Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.596436 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" containerName="marketplace-operator" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.597158 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.599217 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.617474 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxql6"] Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.717572 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/255ea7b7-2364-4ebf-9104-6a78278ee9c0-catalog-content\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.717777 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-875lm\" (UniqueName: \"kubernetes.io/projected/255ea7b7-2364-4ebf-9104-6a78278ee9c0-kube-api-access-875lm\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.717907 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/255ea7b7-2364-4ebf-9104-6a78278ee9c0-utilities\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.799625 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qx8lq"] Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.800752 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.811136 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.819583 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/255ea7b7-2364-4ebf-9104-6a78278ee9c0-catalog-content\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.819680 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-875lm\" (UniqueName: \"kubernetes.io/projected/255ea7b7-2364-4ebf-9104-6a78278ee9c0-kube-api-access-875lm\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.820820 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/255ea7b7-2364-4ebf-9104-6a78278ee9c0-catalog-content\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.820999 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/255ea7b7-2364-4ebf-9104-6a78278ee9c0-utilities\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.821360 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/255ea7b7-2364-4ebf-9104-6a78278ee9c0-utilities\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.824597 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qx8lq"] Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.855529 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-875lm\" (UniqueName: \"kubernetes.io/projected/255ea7b7-2364-4ebf-9104-6a78278ee9c0-kube-api-access-875lm\") pod \"redhat-marketplace-xxql6\" (UID: \"255ea7b7-2364-4ebf-9104-6a78278ee9c0\") " pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.888859 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244c63f8-c484-4edb-9cb6-0ac6a9dac136" path="/var/lib/kubelet/pods/244c63f8-c484-4edb-9cb6-0ac6a9dac136/volumes" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.889670 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d0d78a-3ea1-4ce6-b8fb-13645cfedf18" path="/var/lib/kubelet/pods/30d0d78a-3ea1-4ce6-b8fb-13645cfedf18/volumes" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.890587 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="416d626a-ef44-4b4e-91ce-51042b01a45a" path="/var/lib/kubelet/pods/416d626a-ef44-4b4e-91ce-51042b01a45a/volumes" Mar 20 17:40:27 crc 
kubenswrapper[4690]: I0320 17:40:27.892240 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d86fac-74cc-41d4-81df-2e718c1568d9" path="/var/lib/kubelet/pods/80d86fac-74cc-41d4-81df-2e718c1568d9/volumes" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.893079 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edacb8ae-57ae-41f3-b13b-a423afa0e2dd" path="/var/lib/kubelet/pods/edacb8ae-57ae-41f3-b13b-a423afa0e2dd/volumes" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.919539 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.922664 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d788b569-8dbd-4311-bab4-04c7cd0f1444-utilities\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.923073 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d788b569-8dbd-4311-bab4-04c7cd0f1444-catalog-content\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:27 crc kubenswrapper[4690]: I0320 17:40:27.923205 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc5tg\" (UniqueName: \"kubernetes.io/projected/d788b569-8dbd-4311-bab4-04c7cd0f1444-kube-api-access-xc5tg\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.025021 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d788b569-8dbd-4311-bab4-04c7cd0f1444-utilities\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.025067 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d788b569-8dbd-4311-bab4-04c7cd0f1444-catalog-content\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.025087 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc5tg\" (UniqueName: \"kubernetes.io/projected/d788b569-8dbd-4311-bab4-04c7cd0f1444-kube-api-access-xc5tg\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.025747 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d788b569-8dbd-4311-bab4-04c7cd0f1444-utilities\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.025783 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d788b569-8dbd-4311-bab4-04c7cd0f1444-catalog-content\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.051600 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc5tg\" (UniqueName: \"kubernetes.io/projected/d788b569-8dbd-4311-bab4-04c7cd0f1444-kube-api-access-xc5tg\") pod \"redhat-operators-qx8lq\" (UID: \"d788b569-8dbd-4311-bab4-04c7cd0f1444\") " pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.136847 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.342163 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxql6"] Mar 20 17:40:28 crc kubenswrapper[4690]: W0320 17:40:28.348200 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255ea7b7_2364_4ebf_9104_6a78278ee9c0.slice/crio-62998c18f0c96bbee00768e155cd5ce1f6b7b698b4595340c61a3181e0cb5b06 WatchSource:0}: Error finding container 62998c18f0c96bbee00768e155cd5ce1f6b7b698b4595340c61a3181e0cb5b06: Status 404 returned error can't find the container with id 62998c18f0c96bbee00768e155cd5ce1f6b7b698b4595340c61a3181e0cb5b06 Mar 20 17:40:28 crc kubenswrapper[4690]: I0320 17:40:28.519451 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qx8lq"] Mar 20 17:40:28 crc kubenswrapper[4690]: W0320 17:40:28.561461 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd788b569_8dbd_4311_bab4_04c7cd0f1444.slice/crio-73d16f650efd677241b66c05eba3111534e8bc54a59d178621f6891a5301092c WatchSource:0}: Error finding container 73d16f650efd677241b66c05eba3111534e8bc54a59d178621f6891a5301092c: Status 404 returned error can't find the container with id 73d16f650efd677241b66c05eba3111534e8bc54a59d178621f6891a5301092c Mar 20 17:40:29 crc kubenswrapper[4690]: I0320 17:40:29.057626 4690 generic.go:334] "Generic (PLEG): container finished" podID="255ea7b7-2364-4ebf-9104-6a78278ee9c0" containerID="02bd511b1272fe3a11743abdb20a1bb24c1719f47ac7089cf42db020e46cc4aa" exitCode=0 Mar 20 17:40:29 crc kubenswrapper[4690]: I0320 17:40:29.057741 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxql6" event={"ID":"255ea7b7-2364-4ebf-9104-6a78278ee9c0","Type":"ContainerDied","Data":"02bd511b1272fe3a11743abdb20a1bb24c1719f47ac7089cf42db020e46cc4aa"} Mar 20 17:40:29 crc kubenswrapper[4690]: I0320 17:40:29.057795 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxql6" event={"ID":"255ea7b7-2364-4ebf-9104-6a78278ee9c0","Type":"ContainerStarted","Data":"62998c18f0c96bbee00768e155cd5ce1f6b7b698b4595340c61a3181e0cb5b06"} Mar 20 17:40:29 crc kubenswrapper[4690]: I0320 17:40:29.059340 4690 generic.go:334] "Generic (PLEG): container finished" podID="d788b569-8dbd-4311-bab4-04c7cd0f1444" containerID="4796abe423dd75b1f64f0337f2e8e6ea5f3410bffd2ae43602a624273ef8745c" exitCode=0 Mar 20 17:40:29 crc kubenswrapper[4690]: I0320 17:40:29.059383 4690 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-qx8lq" event={"ID":"d788b569-8dbd-4311-bab4-04c7cd0f1444","Type":"ContainerDied","Data":"4796abe423dd75b1f64f0337f2e8e6ea5f3410bffd2ae43602a624273ef8745c"} Mar 20 17:40:29 crc kubenswrapper[4690]: I0320 17:40:29.059414 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx8lq" event={"ID":"d788b569-8dbd-4311-bab4-04c7cd0f1444","Type":"ContainerStarted","Data":"73d16f650efd677241b66c05eba3111534e8bc54a59d178621f6891a5301092c"} Mar 20 17:40:29 crc kubenswrapper[4690]: I0320 17:40:29.996141 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bnw8t"] Mar 20 17:40:29 crc kubenswrapper[4690]: I0320 17:40:29.998078 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.000287 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.000989 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bnw8t"] Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.066341 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx8lq" event={"ID":"d788b569-8dbd-4311-bab4-04c7cd0f1444","Type":"ContainerStarted","Data":"b9b0f6c4210374c71c584e84daf40823ada0f4e36be0f8bf704c9a0c1e9f2e3b"} Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.069700 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxql6" event={"ID":"255ea7b7-2364-4ebf-9104-6a78278ee9c0","Type":"ContainerStarted","Data":"83200800c4a15d46c0103b9e068cd06ab83bd46a621eebb426fbee4716280844"} Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.151931 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/254fbc18-10d1-444c-aef5-12f66b65b191-catalog-content\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.152353 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/254fbc18-10d1-444c-aef5-12f66b65b191-utilities\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.152554 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6wnl\" (UniqueName: \"kubernetes.io/projected/254fbc18-10d1-444c-aef5-12f66b65b191-kube-api-access-m6wnl\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.204870 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wk568"] Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.206577 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.210025 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.210083 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wk568"] Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.254135 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6wnl\" (UniqueName: \"kubernetes.io/projected/254fbc18-10d1-444c-aef5-12f66b65b191-kube-api-access-m6wnl\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.254220 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/254fbc18-10d1-444c-aef5-12f66b65b191-catalog-content\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.254246 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/254fbc18-10d1-444c-aef5-12f66b65b191-utilities\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.254695 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/254fbc18-10d1-444c-aef5-12f66b65b191-utilities\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.254969 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/254fbc18-10d1-444c-aef5-12f66b65b191-catalog-content\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.275272 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6wnl\" (UniqueName: \"kubernetes.io/projected/254fbc18-10d1-444c-aef5-12f66b65b191-kube-api-access-m6wnl\") pod \"community-operators-bnw8t\" (UID: \"254fbc18-10d1-444c-aef5-12f66b65b191\") " pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.316347 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.355572 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-catalog-content\") pod \"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.355656 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-utilities\") pod \"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:30 crc kubenswrapper[4690]: I0320 17:40:30.355701 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rk6\" (UniqueName: \"kubernetes.io/projected/09aec03c-b31e-4b02-aed6-ffce07763b4d-kube-api-access-v4rk6\") pod \"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:30.456800 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-catalog-content\") pod \"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:30.456859 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-utilities\") pod \"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:30.456888 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rk6\" (UniqueName: \"kubernetes.io/projected/09aec03c-b31e-4b02-aed6-ffce07763b4d-kube-api-access-v4rk6\") pod \"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:30.457456 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-catalog-content\") pod \"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:30.457544 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-utilities\") pod \"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:30.489224 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rk6\" (UniqueName: \"kubernetes.io/projected/09aec03c-b31e-4b02-aed6-ffce07763b4d-kube-api-access-v4rk6\") pod 
\"certified-operators-wk568\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:30.535609 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:31.085192 4690 generic.go:334] "Generic (PLEG): container finished" podID="d788b569-8dbd-4311-bab4-04c7cd0f1444" containerID="b9b0f6c4210374c71c584e84daf40823ada0f4e36be0f8bf704c9a0c1e9f2e3b" exitCode=0 Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:31.085247 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx8lq" event={"ID":"d788b569-8dbd-4311-bab4-04c7cd0f1444","Type":"ContainerDied","Data":"b9b0f6c4210374c71c584e84daf40823ada0f4e36be0f8bf704c9a0c1e9f2e3b"} Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:31.088763 4690 generic.go:334] "Generic (PLEG): container finished" podID="255ea7b7-2364-4ebf-9104-6a78278ee9c0" containerID="83200800c4a15d46c0103b9e068cd06ab83bd46a621eebb426fbee4716280844" exitCode=0 Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:31.088841 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxql6" event={"ID":"255ea7b7-2364-4ebf-9104-6a78278ee9c0","Type":"ContainerDied","Data":"83200800c4a15d46c0103b9e068cd06ab83bd46a621eebb426fbee4716280844"} Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:31.088888 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxql6" event={"ID":"255ea7b7-2364-4ebf-9104-6a78278ee9c0","Type":"ContainerStarted","Data":"d31c15200165664acd3332a4a95321ffe6e896303a7efab5885d744f878980a0"} Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:31.137452 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xxql6" podStartSLOduration=2.677180534 podStartE2EDuration="4.137435945s" podCreationTimestamp="2026-03-20 17:40:27 +0000 UTC" firstStartedPulling="2026-03-20 17:40:29.06080394 +0000 UTC m=+503.926629658" lastFinishedPulling="2026-03-20 17:40:30.521059391 +0000 UTC m=+505.386885069" observedRunningTime="2026-03-20 17:40:31.136323984 +0000 UTC m=+506.002149672" watchObservedRunningTime="2026-03-20 17:40:31.137435945 +0000 UTC m=+506.003261623" Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:31.394340 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wk568"] Mar 20 17:40:31 crc kubenswrapper[4690]: W0320 17:40:31.397484 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09aec03c_b31e_4b02_aed6_ffce07763b4d.slice/crio-4a873520ac19400446f575541e893fac035686b7518241968a369f706a2b2f2c WatchSource:0}: Error finding container 4a873520ac19400446f575541e893fac035686b7518241968a369f706a2b2f2c: Status 404 returned error can't find the container with id 4a873520ac19400446f575541e893fac035686b7518241968a369f706a2b2f2c Mar 20 17:40:31 crc kubenswrapper[4690]: I0320 17:40:31.426936 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bnw8t"] Mar 20 17:40:32 crc kubenswrapper[4690]: I0320 17:40:32.103644 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk568" 
event={"ID":"09aec03c-b31e-4b02-aed6-ffce07763b4d","Type":"ContainerDied","Data":"a9edfbb98722840af3e3fff49f0b5b309332b8d846e9e01f337019b51a70cac0"} Mar 20 17:40:32 crc kubenswrapper[4690]: I0320 17:40:32.103520 4690 generic.go:334] "Generic (PLEG): container finished" podID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerID="a9edfbb98722840af3e3fff49f0b5b309332b8d846e9e01f337019b51a70cac0" exitCode=0 Mar 20 17:40:32 crc kubenswrapper[4690]: I0320 17:40:32.104218 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk568" event={"ID":"09aec03c-b31e-4b02-aed6-ffce07763b4d","Type":"ContainerStarted","Data":"4a873520ac19400446f575541e893fac035686b7518241968a369f706a2b2f2c"} Mar 20 17:40:32 crc kubenswrapper[4690]: I0320 17:40:32.111611 4690 generic.go:334] "Generic (PLEG): container finished" podID="254fbc18-10d1-444c-aef5-12f66b65b191" containerID="6f542c67b0550c2f2c88d3a7a0205567f2646f37d9c9a193d5fcb0b06117e842" exitCode=0 Mar 20 17:40:32 crc kubenswrapper[4690]: I0320 17:40:32.111868 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnw8t" event={"ID":"254fbc18-10d1-444c-aef5-12f66b65b191","Type":"ContainerDied","Data":"6f542c67b0550c2f2c88d3a7a0205567f2646f37d9c9a193d5fcb0b06117e842"} Mar 20 17:40:32 crc kubenswrapper[4690]: I0320 17:40:32.111937 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnw8t" event={"ID":"254fbc18-10d1-444c-aef5-12f66b65b191","Type":"ContainerStarted","Data":"1ceea201669e4d2d1b1547a4e7a2e1ab2ff27ce6d5d1f75bc722559bcd858104"} Mar 20 17:40:32 crc kubenswrapper[4690]: I0320 17:40:32.119210 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx8lq" event={"ID":"d788b569-8dbd-4311-bab4-04c7cd0f1444","Type":"ContainerStarted","Data":"23c03a01d9d7ad1c72a23c93a64cc7dba06e4f7570e8b64c79d4f6975d4c63c8"} Mar 20 17:40:32 crc kubenswrapper[4690]: I0320 17:40:32.166994 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qx8lq" podStartSLOduration=2.679405256 podStartE2EDuration="5.166973704s" podCreationTimestamp="2026-03-20 17:40:27 +0000 UTC" firstStartedPulling="2026-03-20 17:40:29.060986015 +0000 UTC m=+503.926811693" lastFinishedPulling="2026-03-20 17:40:31.548554463 +0000 UTC m=+506.414380141" observedRunningTime="2026-03-20 17:40:32.160676969 +0000 UTC m=+507.026502647" watchObservedRunningTime="2026-03-20 17:40:32.166973704 +0000 UTC m=+507.032799382" Mar 20 17:40:34 crc kubenswrapper[4690]: I0320 17:40:34.130236 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk568" event={"ID":"09aec03c-b31e-4b02-aed6-ffce07763b4d","Type":"ContainerStarted","Data":"aa7981990c68b2ea91a311c25f4f1dbe9255c0002203813f3197221af55000cb"} Mar 20 17:40:34 crc kubenswrapper[4690]: I0320 17:40:34.132844 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnw8t" event={"ID":"254fbc18-10d1-444c-aef5-12f66b65b191","Type":"ContainerStarted","Data":"f22fff76c86ff2ecf19b43d9d80211dcc6c18a966ddd3a0459645966b3533904"} Mar 20 17:40:35 crc kubenswrapper[4690]: I0320 17:40:35.142329 4690 generic.go:334] "Generic (PLEG): container finished" podID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerID="aa7981990c68b2ea91a311c25f4f1dbe9255c0002203813f3197221af55000cb" exitCode=0 Mar 20 17:40:35 crc kubenswrapper[4690]: I0320 17:40:35.142699 4690 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk568" event={"ID":"09aec03c-b31e-4b02-aed6-ffce07763b4d","Type":"ContainerDied","Data":"aa7981990c68b2ea91a311c25f4f1dbe9255c0002203813f3197221af55000cb"} Mar 20 17:40:35 crc kubenswrapper[4690]: I0320 17:40:35.149688 4690 generic.go:334] "Generic (PLEG): container finished" podID="254fbc18-10d1-444c-aef5-12f66b65b191" containerID="f22fff76c86ff2ecf19b43d9d80211dcc6c18a966ddd3a0459645966b3533904" exitCode=0 Mar 20 17:40:35 crc kubenswrapper[4690]: I0320 17:40:35.149760 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnw8t" event={"ID":"254fbc18-10d1-444c-aef5-12f66b65b191","Type":"ContainerDied","Data":"f22fff76c86ff2ecf19b43d9d80211dcc6c18a966ddd3a0459645966b3533904"} Mar 20 17:40:36 crc kubenswrapper[4690]: I0320 17:40:36.164938 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk568" event={"ID":"09aec03c-b31e-4b02-aed6-ffce07763b4d","Type":"ContainerStarted","Data":"71d97eb223d86f53014515b0110ba5406b6331d29f8eef60028f8b7a708c39d9"} Mar 20 17:40:36 crc kubenswrapper[4690]: I0320 17:40:36.172565 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnw8t" event={"ID":"254fbc18-10d1-444c-aef5-12f66b65b191","Type":"ContainerStarted","Data":"6fa5748d3218c9c4621d994ab3216ac0e09eae2e543a32cf1e6b80d1fef1099c"} Mar 20 17:40:36 crc kubenswrapper[4690]: I0320 17:40:36.186767 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wk568" podStartSLOduration=2.6881424860000003 podStartE2EDuration="6.18675049s" podCreationTimestamp="2026-03-20 17:40:30 +0000 UTC" firstStartedPulling="2026-03-20 17:40:32.10558493 +0000 UTC m=+506.971410608" lastFinishedPulling="2026-03-20 17:40:35.604192934 +0000 UTC m=+510.470018612" observedRunningTime="2026-03-20 17:40:36.184031834 +0000 UTC m=+511.049857512" watchObservedRunningTime="2026-03-20 17:40:36.18675049 +0000 UTC m=+511.052576168" Mar 20 17:40:36 crc kubenswrapper[4690]: I0320 17:40:36.209346 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bnw8t" podStartSLOduration=3.7491560489999998 podStartE2EDuration="7.209332596s" podCreationTimestamp="2026-03-20 17:40:29 +0000 UTC" firstStartedPulling="2026-03-20 17:40:32.11456355 +0000 UTC m=+506.980389258" lastFinishedPulling="2026-03-20 17:40:35.574740127 +0000 UTC m=+510.440565805" observedRunningTime="2026-03-20 17:40:36.20802316 +0000 UTC m=+511.073848928" watchObservedRunningTime="2026-03-20 17:40:36.209332596 +0000 UTC m=+511.075158274" Mar 20 17:40:37 crc kubenswrapper[4690]: I0320 17:40:37.920790 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:37 crc kubenswrapper[4690]: I0320 17:40:37.920860 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:37 crc kubenswrapper[4690]: I0320 17:40:37.963155 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:38 crc kubenswrapper[4690]: I0320 17:40:38.137156 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:38 crc kubenswrapper[4690]: I0320 
17:40:38.137364 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:38 crc kubenswrapper[4690]: I0320 17:40:38.232630 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xxql6" Mar 20 17:40:39 crc kubenswrapper[4690]: I0320 17:40:39.178057 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qx8lq" podUID="d788b569-8dbd-4311-bab4-04c7cd0f1444" containerName="registry-server" probeResult="failure" output=< Mar 20 17:40:39 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 17:40:39 crc kubenswrapper[4690]: > Mar 20 17:40:40 crc kubenswrapper[4690]: I0320 17:40:40.317393 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:40 crc kubenswrapper[4690]: I0320 17:40:40.317451 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:40 crc kubenswrapper[4690]: I0320 17:40:40.379928 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:40 crc kubenswrapper[4690]: I0320 17:40:40.536954 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:40 crc kubenswrapper[4690]: I0320 17:40:40.536990 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:40 crc kubenswrapper[4690]: I0320 17:40:40.594329 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:41 crc kubenswrapper[4690]: I0320 17:40:41.261078 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:40:41 crc kubenswrapper[4690]: I0320 17:40:41.273794 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bnw8t" Mar 20 17:40:48 crc kubenswrapper[4690]: I0320 17:40:48.195579 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:40:48 crc kubenswrapper[4690]: I0320 17:40:48.258843 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qx8lq" Mar 20 17:41:54 crc kubenswrapper[4690]: I0320 17:41:54.274979 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:41:54 crc kubenswrapper[4690]: I0320 17:41:54.275825 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.144568 4690 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567142-r6ngc"] Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.145893 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-r6ngc" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.148481 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.149118 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.150899 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.159353 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-r6ngc"] Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.299896 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlcj\" (UniqueName: \"kubernetes.io/projected/75cd9cef-2bb6-4c9c-97a4-ed93def89d56-kube-api-access-wqlcj\") pod \"auto-csr-approver-29567142-r6ngc\" (UID: \"75cd9cef-2bb6-4c9c-97a4-ed93def89d56\") " pod="openshift-infra/auto-csr-approver-29567142-r6ngc" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.401165 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlcj\" (UniqueName: \"kubernetes.io/projected/75cd9cef-2bb6-4c9c-97a4-ed93def89d56-kube-api-access-wqlcj\") pod \"auto-csr-approver-29567142-r6ngc\" (UID: \"75cd9cef-2bb6-4c9c-97a4-ed93def89d56\") " pod="openshift-infra/auto-csr-approver-29567142-r6ngc" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.440032 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlcj\" (UniqueName: \"kubernetes.io/projected/75cd9cef-2bb6-4c9c-97a4-ed93def89d56-kube-api-access-wqlcj\") pod \"auto-csr-approver-29567142-r6ngc\" (UID: \"75cd9cef-2bb6-4c9c-97a4-ed93def89d56\") " pod="openshift-infra/auto-csr-approver-29567142-r6ngc" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.465970 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-r6ngc" Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.702986 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-r6ngc"] Mar 20 17:42:00 crc kubenswrapper[4690]: I0320 17:42:00.710525 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:42:01 crc kubenswrapper[4690]: I0320 17:42:01.310379 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-r6ngc" event={"ID":"75cd9cef-2bb6-4c9c-97a4-ed93def89d56","Type":"ContainerStarted","Data":"2dfc35018c3b11f9d9d40175975d6e7cb0c34359d1a7268bbe00d6400735235d"} Mar 20 17:42:03 crc kubenswrapper[4690]: I0320 17:42:03.326964 4690 generic.go:334] "Generic (PLEG): container finished" podID="75cd9cef-2bb6-4c9c-97a4-ed93def89d56" containerID="09ba0fe92b758b8a8fff30349a490788fbe227c83e5ae7e7a6eeb0f893dcdaec" exitCode=0 Mar 20 17:42:03 crc kubenswrapper[4690]: I0320 17:42:03.327540 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-r6ngc" event={"ID":"75cd9cef-2bb6-4c9c-97a4-ed93def89d56","Type":"ContainerDied","Data":"09ba0fe92b758b8a8fff30349a490788fbe227c83e5ae7e7a6eeb0f893dcdaec"} Mar 20 17:42:04 crc kubenswrapper[4690]: I0320 17:42:04.600079 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-r6ngc" Mar 20 17:42:04 crc kubenswrapper[4690]: I0320 17:42:04.759140 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqlcj\" (UniqueName: \"kubernetes.io/projected/75cd9cef-2bb6-4c9c-97a4-ed93def89d56-kube-api-access-wqlcj\") pod \"75cd9cef-2bb6-4c9c-97a4-ed93def89d56\" (UID: \"75cd9cef-2bb6-4c9c-97a4-ed93def89d56\") " Mar 20 17:42:04 crc kubenswrapper[4690]: I0320 17:42:04.765945 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75cd9cef-2bb6-4c9c-97a4-ed93def89d56-kube-api-access-wqlcj" (OuterVolumeSpecName: "kube-api-access-wqlcj") pod "75cd9cef-2bb6-4c9c-97a4-ed93def89d56" (UID: "75cd9cef-2bb6-4c9c-97a4-ed93def89d56"). InnerVolumeSpecName "kube-api-access-wqlcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:04 crc kubenswrapper[4690]: I0320 17:42:04.860652 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqlcj\" (UniqueName: \"kubernetes.io/projected/75cd9cef-2bb6-4c9c-97a4-ed93def89d56-kube-api-access-wqlcj\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:05 crc kubenswrapper[4690]: I0320 17:42:05.341741 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-r6ngc" event={"ID":"75cd9cef-2bb6-4c9c-97a4-ed93def89d56","Type":"ContainerDied","Data":"2dfc35018c3b11f9d9d40175975d6e7cb0c34359d1a7268bbe00d6400735235d"} Mar 20 17:42:05 crc kubenswrapper[4690]: I0320 17:42:05.341802 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dfc35018c3b11f9d9d40175975d6e7cb0c34359d1a7268bbe00d6400735235d" Mar 20 17:42:05 crc kubenswrapper[4690]: I0320 17:42:05.341851 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-r6ngc" Mar 20 17:42:05 crc kubenswrapper[4690]: I0320 17:42:05.663836 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-lsh75"] Mar 20 17:42:05 crc kubenswrapper[4690]: I0320 17:42:05.669645 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-lsh75"] Mar 20 17:42:05 crc kubenswrapper[4690]: I0320 17:42:05.905470 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c872c1-ae2b-4fd2-bb6f-e387fab73a06" path="/var/lib/kubelet/pods/d1c872c1-ae2b-4fd2-bb6f-e387fab73a06/volumes" Mar 20 17:42:07 crc kubenswrapper[4690]: I0320 17:42:07.028194 4690 scope.go:117] "RemoveContainer" containerID="4adc951754cfda921010f0fa0d9abfc0c746e7568c061110a54ad12757acf5eb" Mar 20 17:42:24 crc kubenswrapper[4690]: I0320 17:42:24.274181 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:42:24 crc kubenswrapper[4690]: I0320 17:42:24.274709 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.274452 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.275056 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.275129 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.276085 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6ff45a480211c2f0e008e8cd259faa67c07e468ebd44fd74d331920aaa63b33"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.276172 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://a6ff45a480211c2f0e008e8cd259faa67c07e468ebd44fd74d331920aaa63b33" gracePeriod=600 Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.778311 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" 
containerID="a6ff45a480211c2f0e008e8cd259faa67c07e468ebd44fd74d331920aaa63b33" exitCode=0 Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.778427 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"a6ff45a480211c2f0e008e8cd259faa67c07e468ebd44fd74d331920aaa63b33"} Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.778785 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"cf3fdd9123c95cd6ed2bd1666f574e1450c7c3856ffdba8c0585b34757d0cf92"} Mar 20 17:42:54 crc kubenswrapper[4690]: I0320 17:42:54.778819 4690 scope.go:117] "RemoveContainer" containerID="810ef61dfd66653c97e50a7c5e658e3e4610648ff84dc8342c8cadb5532980bc" Mar 20 17:43:07 crc kubenswrapper[4690]: I0320 17:43:07.087385 4690 scope.go:117] "RemoveContainer" containerID="6901f038f408141511eb1c951407621da6d4ab4dff87c1828b77f43ae8798bbb" Mar 20 17:43:07 crc kubenswrapper[4690]: I0320 17:43:07.140067 4690 scope.go:117] "RemoveContainer" containerID="0fae6e0c4bfd93a4ea4458663879b020eaa4104b33a23ce7f203217e5c6f2138" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.148984 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567144-gsxt8"] Mar 20 17:44:00 crc kubenswrapper[4690]: E0320 17:44:00.149908 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cd9cef-2bb6-4c9c-97a4-ed93def89d56" containerName="oc" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.149929 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cd9cef-2bb6-4c9c-97a4-ed93def89d56" containerName="oc" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.150104 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="75cd9cef-2bb6-4c9c-97a4-ed93def89d56" containerName="oc" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.150774 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-gsxt8" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.154443 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.154554 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.154791 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.164119 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-gsxt8"] Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.264697 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx67q\" (UniqueName: \"kubernetes.io/projected/7bbf81b2-25de-4f9f-b7df-e61e997e9418-kube-api-access-qx67q\") pod \"auto-csr-approver-29567144-gsxt8\" (UID: \"7bbf81b2-25de-4f9f-b7df-e61e997e9418\") " pod="openshift-infra/auto-csr-approver-29567144-gsxt8" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.367714 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx67q\" (UniqueName: \"kubernetes.io/projected/7bbf81b2-25de-4f9f-b7df-e61e997e9418-kube-api-access-qx67q\") pod \"auto-csr-approver-29567144-gsxt8\" (UID: \"7bbf81b2-25de-4f9f-b7df-e61e997e9418\") " pod="openshift-infra/auto-csr-approver-29567144-gsxt8" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.416203 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx67q\" (UniqueName: \"kubernetes.io/projected/7bbf81b2-25de-4f9f-b7df-e61e997e9418-kube-api-access-qx67q\") pod \"auto-csr-approver-29567144-gsxt8\" (UID: \"7bbf81b2-25de-4f9f-b7df-e61e997e9418\") " pod="openshift-infra/auto-csr-approver-29567144-gsxt8" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.482450 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-gsxt8" Mar 20 17:44:00 crc kubenswrapper[4690]: I0320 17:44:00.930489 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-gsxt8"] Mar 20 17:44:00 crc kubenswrapper[4690]: W0320 17:44:00.938847 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bbf81b2_25de_4f9f_b7df_e61e997e9418.slice/crio-5285af33142026cfed68f7649f7ec6ba168e07b8c4c6cc6f79939fb88977244a WatchSource:0}: Error finding container 5285af33142026cfed68f7649f7ec6ba168e07b8c4c6cc6f79939fb88977244a: Status 404 returned error can't find the container with id 5285af33142026cfed68f7649f7ec6ba168e07b8c4c6cc6f79939fb88977244a Mar 20 17:44:01 crc kubenswrapper[4690]: I0320 17:44:01.412398 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-gsxt8" event={"ID":"7bbf81b2-25de-4f9f-b7df-e61e997e9418","Type":"ContainerStarted","Data":"5285af33142026cfed68f7649f7ec6ba168e07b8c4c6cc6f79939fb88977244a"} Mar 20 17:44:03 crc kubenswrapper[4690]: I0320 17:44:03.427335 4690 generic.go:334] "Generic (PLEG): container finished" podID="7bbf81b2-25de-4f9f-b7df-e61e997e9418" containerID="607bc9a4bf1c0ac71f852eab2f02e5048ac0c738c9bdea5c44150edcba212202" exitCode=0 Mar 20 17:44:03 crc kubenswrapper[4690]: I0320 17:44:03.427418 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-gsxt8" event={"ID":"7bbf81b2-25de-4f9f-b7df-e61e997e9418","Type":"ContainerDied","Data":"607bc9a4bf1c0ac71f852eab2f02e5048ac0c738c9bdea5c44150edcba212202"} Mar 20 17:44:04 crc kubenswrapper[4690]: I0320 17:44:04.751893 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-gsxt8" Mar 20 17:44:04 crc kubenswrapper[4690]: I0320 17:44:04.930879 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx67q\" (UniqueName: \"kubernetes.io/projected/7bbf81b2-25de-4f9f-b7df-e61e997e9418-kube-api-access-qx67q\") pod \"7bbf81b2-25de-4f9f-b7df-e61e997e9418\" (UID: \"7bbf81b2-25de-4f9f-b7df-e61e997e9418\") " Mar 20 17:44:04 crc kubenswrapper[4690]: I0320 17:44:04.939776 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbf81b2-25de-4f9f-b7df-e61e997e9418-kube-api-access-qx67q" (OuterVolumeSpecName: "kube-api-access-qx67q") pod "7bbf81b2-25de-4f9f-b7df-e61e997e9418" (UID: "7bbf81b2-25de-4f9f-b7df-e61e997e9418"). InnerVolumeSpecName "kube-api-access-qx67q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:44:05 crc kubenswrapper[4690]: I0320 17:44:05.033091 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx67q\" (UniqueName: \"kubernetes.io/projected/7bbf81b2-25de-4f9f-b7df-e61e997e9418-kube-api-access-qx67q\") on node \"crc\" DevicePath \"\"" Mar 20 17:44:05 crc kubenswrapper[4690]: I0320 17:44:05.446738 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-gsxt8" event={"ID":"7bbf81b2-25de-4f9f-b7df-e61e997e9418","Type":"ContainerDied","Data":"5285af33142026cfed68f7649f7ec6ba168e07b8c4c6cc6f79939fb88977244a"} Mar 20 17:44:05 crc kubenswrapper[4690]: I0320 17:44:05.447207 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5285af33142026cfed68f7649f7ec6ba168e07b8c4c6cc6f79939fb88977244a" Mar 20 17:44:05 crc kubenswrapper[4690]: I0320 17:44:05.446818 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-gsxt8" Mar 20 17:44:05 crc kubenswrapper[4690]: I0320 17:44:05.823771 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-th8kh"] Mar 20 17:44:05 crc kubenswrapper[4690]: I0320 17:44:05.829659 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-th8kh"] Mar 20 17:44:05 crc kubenswrapper[4690]: I0320 17:44:05.894202 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8050c5b1-a071-48e0-a371-33b5a13765cd" path="/var/lib/kubelet/pods/8050c5b1-a071-48e0-a371-33b5a13765cd/volumes" Mar 20 17:44:07 crc kubenswrapper[4690]: I0320 17:44:07.228017 4690 scope.go:117] "RemoveContainer" containerID="0d3bb51a70f2d1c02efb6c8a28224826384cf12d0e33c9c1769ca6d92c266120" Mar 20 17:44:54 crc kubenswrapper[4690]: I0320 17:44:54.273971 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:44:54 crc kubenswrapper[4690]: I0320 17:44:54.274721 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.143594 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w"] Mar 20 17:45:00 crc kubenswrapper[4690]: E0320 17:45:00.144504 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbf81b2-25de-4f9f-b7df-e61e997e9418" containerName="oc" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.146089 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbf81b2-25de-4f9f-b7df-e61e997e9418" containerName="oc" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.146305 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbf81b2-25de-4f9f-b7df-e61e997e9418" containerName="oc" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.147028 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.149899 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.149900 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.151699 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w"] Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.288695 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eb37b19-502e-4b27-8fda-4d31630eb068-config-volume\") pod \"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.288791 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnhp\" (UniqueName: \"kubernetes.io/projected/1eb37b19-502e-4b27-8fda-4d31630eb068-kube-api-access-kqnhp\") pod \"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.288967 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eb37b19-502e-4b27-8fda-4d31630eb068-secret-volume\") pod \"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.390448 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eb37b19-502e-4b27-8fda-4d31630eb068-config-volume\") pod \"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.390665 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqnhp\" (UniqueName: \"kubernetes.io/projected/1eb37b19-502e-4b27-8fda-4d31630eb068-kube-api-access-kqnhp\") pod \"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.390758 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eb37b19-502e-4b27-8fda-4d31630eb068-secret-volume\") pod \"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.392937 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eb37b19-502e-4b27-8fda-4d31630eb068-config-volume\") pod 
\"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.410598 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eb37b19-502e-4b27-8fda-4d31630eb068-secret-volume\") pod \"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.435194 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnhp\" (UniqueName: \"kubernetes.io/projected/1eb37b19-502e-4b27-8fda-4d31630eb068-kube-api-access-kqnhp\") pod \"collect-profiles-29567145-rch7w\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.472222 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:00 crc kubenswrapper[4690]: I0320 17:45:00.697219 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w"] Mar 20 17:45:01 crc kubenswrapper[4690]: I0320 17:45:01.007982 4690 generic.go:334] "Generic (PLEG): container finished" podID="1eb37b19-502e-4b27-8fda-4d31630eb068" containerID="b274e77be08a76224f30f0f6fe348b80db5f4c05e6e9047246f5ded12e6d429f" exitCode=0 Mar 20 17:45:01 crc kubenswrapper[4690]: I0320 17:45:01.008211 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" event={"ID":"1eb37b19-502e-4b27-8fda-4d31630eb068","Type":"ContainerDied","Data":"b274e77be08a76224f30f0f6fe348b80db5f4c05e6e9047246f5ded12e6d429f"} Mar 20 17:45:01 crc kubenswrapper[4690]: I0320 17:45:01.008388 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" event={"ID":"1eb37b19-502e-4b27-8fda-4d31630eb068","Type":"ContainerStarted","Data":"5af99de47547bcf3669c49481b37cd5e629c2aaf39a7f0cbb97b60a5c0d3692c"} Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.265494 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.417887 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eb37b19-502e-4b27-8fda-4d31630eb068-secret-volume\") pod \"1eb37b19-502e-4b27-8fda-4d31630eb068\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.418035 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eb37b19-502e-4b27-8fda-4d31630eb068-config-volume\") pod \"1eb37b19-502e-4b27-8fda-4d31630eb068\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.418155 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqnhp\" (UniqueName: \"kubernetes.io/projected/1eb37b19-502e-4b27-8fda-4d31630eb068-kube-api-access-kqnhp\") pod \"1eb37b19-502e-4b27-8fda-4d31630eb068\" (UID: \"1eb37b19-502e-4b27-8fda-4d31630eb068\") " Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.418904 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eb37b19-502e-4b27-8fda-4d31630eb068-config-volume" (OuterVolumeSpecName: "config-volume") pod "1eb37b19-502e-4b27-8fda-4d31630eb068" (UID: "1eb37b19-502e-4b27-8fda-4d31630eb068"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.423752 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb37b19-502e-4b27-8fda-4d31630eb068-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1eb37b19-502e-4b27-8fda-4d31630eb068" (UID: "1eb37b19-502e-4b27-8fda-4d31630eb068"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.423980 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb37b19-502e-4b27-8fda-4d31630eb068-kube-api-access-kqnhp" (OuterVolumeSpecName: "kube-api-access-kqnhp") pod "1eb37b19-502e-4b27-8fda-4d31630eb068" (UID: "1eb37b19-502e-4b27-8fda-4d31630eb068"). InnerVolumeSpecName "kube-api-access-kqnhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.519061 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqnhp\" (UniqueName: \"kubernetes.io/projected/1eb37b19-502e-4b27-8fda-4d31630eb068-kube-api-access-kqnhp\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.519099 4690 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eb37b19-502e-4b27-8fda-4d31630eb068-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:02 crc kubenswrapper[4690]: I0320 17:45:02.519110 4690 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eb37b19-502e-4b27-8fda-4d31630eb068-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:03 crc kubenswrapper[4690]: I0320 17:45:03.023626 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" event={"ID":"1eb37b19-502e-4b27-8fda-4d31630eb068","Type":"ContainerDied","Data":"5af99de47547bcf3669c49481b37cd5e629c2aaf39a7f0cbb97b60a5c0d3692c"} Mar 20 17:45:03 crc kubenswrapper[4690]: I0320 17:45:03.023690 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5af99de47547bcf3669c49481b37cd5e629c2aaf39a7f0cbb97b60a5c0d3692c" Mar 20 17:45:03 crc kubenswrapper[4690]: I0320 17:45:03.023712 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w" Mar 20 17:45:07 crc kubenswrapper[4690]: I0320 17:45:07.279802 4690 scope.go:117] "RemoveContainer" containerID="cfa34146f20ea5119cbff97b62210e4507312486c04e37ac7976f627f7405611" Mar 20 17:45:24 crc kubenswrapper[4690]: I0320 17:45:24.274508 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:45:24 crc kubenswrapper[4690]: I0320 17:45:24.275099 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:45:54 crc kubenswrapper[4690]: I0320 17:45:54.274042 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:45:54 crc kubenswrapper[4690]: I0320 17:45:54.274821 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:45:54 crc kubenswrapper[4690]: I0320 17:45:54.274872 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:45:54 crc 
kubenswrapper[4690]: I0320 17:45:54.275463 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf3fdd9123c95cd6ed2bd1666f574e1450c7c3856ffdba8c0585b34757d0cf92"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:45:54 crc kubenswrapper[4690]: I0320 17:45:54.275525 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://cf3fdd9123c95cd6ed2bd1666f574e1450c7c3856ffdba8c0585b34757d0cf92" gracePeriod=600 Mar 20 17:45:54 crc kubenswrapper[4690]: I0320 17:45:54.431361 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="cf3fdd9123c95cd6ed2bd1666f574e1450c7c3856ffdba8c0585b34757d0cf92" exitCode=0 Mar 20 17:45:54 crc kubenswrapper[4690]: I0320 17:45:54.431579 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"cf3fdd9123c95cd6ed2bd1666f574e1450c7c3856ffdba8c0585b34757d0cf92"} Mar 20 17:45:54 crc kubenswrapper[4690]: I0320 17:45:54.431645 4690 scope.go:117] "RemoveContainer" containerID="a6ff45a480211c2f0e008e8cd259faa67c07e468ebd44fd74d331920aaa63b33" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.105219 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr"] Mar 20 17:45:55 crc kubenswrapper[4690]: E0320 17:45:55.105795 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb37b19-502e-4b27-8fda-4d31630eb068" containerName="collect-profiles" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.105813 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb37b19-502e-4b27-8fda-4d31630eb068" containerName="collect-profiles" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.105918 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb37b19-502e-4b27-8fda-4d31630eb068" containerName="collect-profiles" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.106363 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.109853 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-sj4vr"] Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.110410 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.110564 4690 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wwv6c" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.110656 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-sj4vr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.110650 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.113960 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr"] Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.115501 4690 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-grlcx" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.118554 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-sj4vr"] Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.142193 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9shq"] Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.142922 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.145783 4690 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-ltt2p" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.157039 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9shq"] Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.163100 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4hkc\" (UniqueName: \"kubernetes.io/projected/7f7c4ed7-ab53-40e9-8977-77afd116ce1b-kube-api-access-z4hkc\") pod \"cert-manager-858654f9db-sj4vr\" (UID: \"7f7c4ed7-ab53-40e9-8977-77afd116ce1b\") " pod="cert-manager/cert-manager-858654f9db-sj4vr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.163174 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2b9x\" (UniqueName: \"kubernetes.io/projected/2040aed1-0ccc-4068-8e68-5ddda58ddd5e-kube-api-access-z2b9x\") pod \"cert-manager-cainjector-cf98fcc89-5lwhr\" (UID: \"2040aed1-0ccc-4068-8e68-5ddda58ddd5e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.264146 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4hkc\" (UniqueName: \"kubernetes.io/projected/7f7c4ed7-ab53-40e9-8977-77afd116ce1b-kube-api-access-z4hkc\") pod \"cert-manager-858654f9db-sj4vr\" (UID: \"7f7c4ed7-ab53-40e9-8977-77afd116ce1b\") " pod="cert-manager/cert-manager-858654f9db-sj4vr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.264195 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm9dg\" (UniqueName: \"kubernetes.io/projected/46b55360-cf52-4b63-90e4-b578b7181c19-kube-api-access-dm9dg\") pod \"cert-manager-webhook-687f57d79b-s9shq\" (UID: \"46b55360-cf52-4b63-90e4-b578b7181c19\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.264251 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2b9x\" (UniqueName: \"kubernetes.io/projected/2040aed1-0ccc-4068-8e68-5ddda58ddd5e-kube-api-access-z2b9x\") pod \"cert-manager-cainjector-cf98fcc89-5lwhr\" (UID: 
\"2040aed1-0ccc-4068-8e68-5ddda58ddd5e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.282480 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4hkc\" (UniqueName: \"kubernetes.io/projected/7f7c4ed7-ab53-40e9-8977-77afd116ce1b-kube-api-access-z4hkc\") pod \"cert-manager-858654f9db-sj4vr\" (UID: \"7f7c4ed7-ab53-40e9-8977-77afd116ce1b\") " pod="cert-manager/cert-manager-858654f9db-sj4vr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.283031 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2b9x\" (UniqueName: \"kubernetes.io/projected/2040aed1-0ccc-4068-8e68-5ddda58ddd5e-kube-api-access-z2b9x\") pod \"cert-manager-cainjector-cf98fcc89-5lwhr\" (UID: \"2040aed1-0ccc-4068-8e68-5ddda58ddd5e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.365497 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm9dg\" (UniqueName: \"kubernetes.io/projected/46b55360-cf52-4b63-90e4-b578b7181c19-kube-api-access-dm9dg\") pod \"cert-manager-webhook-687f57d79b-s9shq\" (UID: \"46b55360-cf52-4b63-90e4-b578b7181c19\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.384427 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm9dg\" (UniqueName: \"kubernetes.io/projected/46b55360-cf52-4b63-90e4-b578b7181c19-kube-api-access-dm9dg\") pod \"cert-manager-webhook-687f57d79b-s9shq\" (UID: \"46b55360-cf52-4b63-90e4-b578b7181c19\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.432613 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.439521 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"3597106c9e9367c28d243129fc42edbd4550b54914b1aeed86c0200ac6936ead"} Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.440002 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-sj4vr" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.460791 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.877154 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9shq"] Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.916790 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr"] Mar 20 17:45:55 crc kubenswrapper[4690]: I0320 17:45:55.920870 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-sj4vr"] Mar 20 17:45:55 crc kubenswrapper[4690]: W0320 17:45:55.928965 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2040aed1_0ccc_4068_8e68_5ddda58ddd5e.slice/crio-04f8ca505cb8cd72da952bb263cd9e14f260ac2391a37e939494c89478fd3c9d WatchSource:0}: Error finding container 04f8ca505cb8cd72da952bb263cd9e14f260ac2391a37e939494c89478fd3c9d: Status 404 returned error can't find the container with id 04f8ca505cb8cd72da952bb263cd9e14f260ac2391a37e939494c89478fd3c9d Mar 20 17:45:56 crc kubenswrapper[4690]: I0320 17:45:56.449202 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" event={"ID":"46b55360-cf52-4b63-90e4-b578b7181c19","Type":"ContainerStarted","Data":"2f7bec666d9ca82787e1a611f40b80537af1a54fabd737e28731076e4cac8949"} Mar 20 17:45:56 crc kubenswrapper[4690]: I0320 17:45:56.453141 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-sj4vr" event={"ID":"7f7c4ed7-ab53-40e9-8977-77afd116ce1b","Type":"ContainerStarted","Data":"3f17699e01d3acb221e26acc8856251180d60765b3077c1e40dec7f0c7ab3694"} Mar 20 17:45:56 crc kubenswrapper[4690]: I0320 17:45:56.455043 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr" event={"ID":"2040aed1-0ccc-4068-8e68-5ddda58ddd5e","Type":"ContainerStarted","Data":"04f8ca505cb8cd72da952bb263cd9e14f260ac2391a37e939494c89478fd3c9d"} Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.127204 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567146-qqzwn"] Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.128573 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.130485 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.130865 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.142150 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-qqzwn"] Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.143250 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.246095 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrtzn\" (UniqueName: \"kubernetes.io/projected/a6397759-9cf9-4996-9fc7-6ec98f00014a-kube-api-access-jrtzn\") pod \"auto-csr-approver-29567146-qqzwn\" (UID: \"a6397759-9cf9-4996-9fc7-6ec98f00014a\") " pod="openshift-infra/auto-csr-approver-29567146-qqzwn" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.346897 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrtzn\" (UniqueName: \"kubernetes.io/projected/a6397759-9cf9-4996-9fc7-6ec98f00014a-kube-api-access-jrtzn\") pod \"auto-csr-approver-29567146-qqzwn\" (UID: \"a6397759-9cf9-4996-9fc7-6ec98f00014a\") " pod="openshift-infra/auto-csr-approver-29567146-qqzwn" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.375201 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrtzn\" (UniqueName: \"kubernetes.io/projected/a6397759-9cf9-4996-9fc7-6ec98f00014a-kube-api-access-jrtzn\") pod \"auto-csr-approver-29567146-qqzwn\" (UID: \"a6397759-9cf9-4996-9fc7-6ec98f00014a\") " pod="openshift-infra/auto-csr-approver-29567146-qqzwn" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.476005 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.496035 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-sj4vr" event={"ID":"7f7c4ed7-ab53-40e9-8977-77afd116ce1b","Type":"ContainerStarted","Data":"b0d77bdd79d600804cab5f434505ab1ffdac31d4d7f141ec4e81179597416c89"} Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.499075 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr" event={"ID":"2040aed1-0ccc-4068-8e68-5ddda58ddd5e","Type":"ContainerStarted","Data":"44a2fe21c8fe8c8216a1510239b4d976d4eedf4d95601d9698bd78022d2f8b9a"} Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.502242 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" event={"ID":"46b55360-cf52-4b63-90e4-b578b7181c19","Type":"ContainerStarted","Data":"e33ce2eee09ce6e5053337597033e788702c34372b3c3811ffc55098e58dcc08"} Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.502490 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.514830 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-sj4vr" podStartSLOduration=2.141604095 podStartE2EDuration="5.514808341s" podCreationTimestamp="2026-03-20 17:45:55 +0000 UTC" firstStartedPulling="2026-03-20 17:45:55.92778957 +0000 UTC m=+830.793615258" lastFinishedPulling="2026-03-20 17:45:59.300993816 +0000 UTC m=+834.166819504" observedRunningTime="2026-03-20 17:46:00.510717085 +0000 UTC m=+835.376542773" watchObservedRunningTime="2026-03-20 17:46:00.514808341 +0000 UTC m=+835.380634029" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.536430 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" podStartSLOduration=2.116327229 podStartE2EDuration="5.536412352s" podCreationTimestamp="2026-03-20 17:45:55 +0000 UTC" firstStartedPulling="2026-03-20 17:45:55.884290179 +0000 UTC m=+830.750115867" lastFinishedPulling="2026-03-20 17:45:59.304375302 +0000 UTC m=+834.170200990" observedRunningTime="2026-03-20 17:46:00.53493014 +0000 UTC m=+835.400755828" watchObservedRunningTime="2026-03-20 17:46:00.536412352 +0000 UTC m=+835.402238040" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.560191 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5lwhr" podStartSLOduration=2.104240708 podStartE2EDuration="5.560172144s" podCreationTimestamp="2026-03-20 17:45:55 +0000 UTC" firstStartedPulling="2026-03-20 17:45:55.933196323 +0000 UTC m=+830.799022001" lastFinishedPulling="2026-03-20 17:45:59.389127739 +0000 UTC m=+834.254953437" observedRunningTime="2026-03-20 17:46:00.55717969 +0000 UTC m=+835.423005378" watchObservedRunningTime="2026-03-20 17:46:00.560172144 +0000 UTC m=+835.425997822" Mar 20 17:46:00 crc kubenswrapper[4690]: I0320 17:46:00.957226 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-qqzwn"] Mar 20 17:46:00 crc kubenswrapper[4690]: W0320 17:46:00.980601 4690 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6397759_9cf9_4996_9fc7_6ec98f00014a.slice/crio-b5e829630273b1dacddd4d14e28f7ea40633b4e56691be78a775b4c24f47cd73 WatchSource:0}: Error finding container b5e829630273b1dacddd4d14e28f7ea40633b4e56691be78a775b4c24f47cd73: Status 404 returned error can't find the container with id b5e829630273b1dacddd4d14e28f7ea40633b4e56691be78a775b4c24f47cd73 Mar 20 17:46:01 crc kubenswrapper[4690]: I0320 17:46:01.512466 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" event={"ID":"a6397759-9cf9-4996-9fc7-6ec98f00014a","Type":"ContainerStarted","Data":"b5e829630273b1dacddd4d14e28f7ea40633b4e56691be78a775b4c24f47cd73"} Mar 20 17:46:02 crc kubenswrapper[4690]: I0320 17:46:02.520715 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" event={"ID":"a6397759-9cf9-4996-9fc7-6ec98f00014a","Type":"ContainerStarted","Data":"8ba775c8634732f742dc32ece3d7951ce7af5e382cd21ed071defbae46a49a7a"} Mar 20 17:46:02 crc kubenswrapper[4690]: I0320 17:46:02.537860 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" podStartSLOduration=1.517365029 podStartE2EDuration="2.537835975s" podCreationTimestamp="2026-03-20 17:46:00 +0000 UTC" firstStartedPulling="2026-03-20 17:46:00.982352696 +0000 UTC m=+835.848178374" lastFinishedPulling="2026-03-20 17:46:02.002823612 +0000 UTC m=+836.868649320" observedRunningTime="2026-03-20 17:46:02.533815181 +0000 UTC m=+837.399640879" watchObservedRunningTime="2026-03-20 17:46:02.537835975 +0000 UTC m=+837.403661693" Mar 20 17:46:03 crc kubenswrapper[4690]: I0320 17:46:03.530901 4690 generic.go:334] "Generic (PLEG): container finished" podID="a6397759-9cf9-4996-9fc7-6ec98f00014a" containerID="8ba775c8634732f742dc32ece3d7951ce7af5e382cd21ed071defbae46a49a7a" exitCode=0 Mar 20 17:46:03 crc kubenswrapper[4690]: I0320 17:46:03.530998 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" event={"ID":"a6397759-9cf9-4996-9fc7-6ec98f00014a","Type":"ContainerDied","Data":"8ba775c8634732f742dc32ece3d7951ce7af5e382cd21ed071defbae46a49a7a"} Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.716757 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7bsmm"] Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.717776 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovn-controller" containerID="cri-o://95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4" gracePeriod=30 Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.717798 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="sbdb" containerID="cri-o://6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff" gracePeriod=30 Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.718057 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098" gracePeriod=30 Mar 20 
17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.718232 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovn-acl-logging" containerID="cri-o://11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901" gracePeriod=30 Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.718368 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="northd" containerID="cri-o://78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f" gracePeriod=30 Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.718610 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kube-rbac-proxy-node" containerID="cri-o://89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852" gracePeriod=30 Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.727230 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="nbdb" containerID="cri-o://d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c" gracePeriod=30 Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.774270 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" containerID="cri-o://e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec" gracePeriod=30 Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.820896 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.911724 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrtzn\" (UniqueName: \"kubernetes.io/projected/a6397759-9cf9-4996-9fc7-6ec98f00014a-kube-api-access-jrtzn\") pod \"a6397759-9cf9-4996-9fc7-6ec98f00014a\" (UID: \"a6397759-9cf9-4996-9fc7-6ec98f00014a\") " Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.918756 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6397759-9cf9-4996-9fc7-6ec98f00014a-kube-api-access-jrtzn" (OuterVolumeSpecName: "kube-api-access-jrtzn") pod "a6397759-9cf9-4996-9fc7-6ec98f00014a" (UID: "a6397759-9cf9-4996-9fc7-6ec98f00014a"). InnerVolumeSpecName "kube-api-access-jrtzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:04 crc kubenswrapper[4690]: I0320 17:46:04.999684 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/3.log" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.002596 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovn-acl-logging/0.log" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.003072 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovn-controller/0.log" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.003656 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.013654 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrtzn\" (UniqueName: \"kubernetes.io/projected/a6397759-9cf9-4996-9fc7-6ec98f00014a-kube-api-access-jrtzn\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.068627 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qvkzb"] Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.068920 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.068941 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.068959 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.068971 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.068986 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kubecfg-setup" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069000 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kubecfg-setup" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069019 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069031 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069047 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069059 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069078 4690 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069091 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069109 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kube-rbac-proxy-node" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069121 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kube-rbac-proxy-node" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069140 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="nbdb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069153 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="nbdb" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069168 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovn-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069183 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovn-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069197 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovn-acl-logging" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069208 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovn-acl-logging" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069226 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6397759-9cf9-4996-9fc7-6ec98f00014a" containerName="oc" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069238 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6397759-9cf9-4996-9fc7-6ec98f00014a" containerName="oc" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069285 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="sbdb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069298 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="sbdb" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069317 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="northd" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069356 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="northd" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069519 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6397759-9cf9-4996-9fc7-6ec98f00014a" containerName="oc" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069542 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovn-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069562 4690 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovn-acl-logging" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069574 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069588 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069602 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069618 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069632 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="nbdb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069645 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="kube-rbac-proxy-node" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069662 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="northd" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069677 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069694 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="sbdb" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.069877 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.069891 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.070037 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a728ab-e286-4606-b922-d510978b863a" containerName="ovnkube-controller" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.072956 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114042 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-ovn-kubernetes\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114083 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-ovn\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114114 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-kubelet\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114140 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmwk9\" (UniqueName: \"kubernetes.io/projected/01a728ab-e286-4606-b922-d510978b863a-kube-api-access-nmwk9\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114170 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-script-lib\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114190 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-etc-openvswitch\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114176 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114205 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-bin\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114219 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-openvswitch\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114234 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114221 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114275 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114242 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-slash\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114281 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114307 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-slash" (OuterVolumeSpecName: "host-slash") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114352 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-node-log\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114385 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01a728ab-e286-4606-b922-d510978b863a-ovn-node-metrics-cert\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114421 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-var-lib-openvswitch\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114445 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-systemd\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114475 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-netd\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114496 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-netns\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114485 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-node-log" (OuterVolumeSpecName: "node-log") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114549 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114527 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-systemd-units\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114578 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114696 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-env-overrides\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114736 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-log-socket\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114770 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114803 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-config\") pod \"01a728ab-e286-4606-b922-d510978b863a\" (UID: \"01a728ab-e286-4606-b922-d510978b863a\") " Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114851 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114919 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-log-socket" (OuterVolumeSpecName: "log-socket") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114933 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). 
InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.114973 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115005 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115011 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115279 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115380 4690 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115420 4690 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115447 4690 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115467 4690 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115469 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115484 4690 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115502 4690 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115519 4690 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115534 4690 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115550 4690 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115567 4690 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115582 4690 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115599 4690 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115615 4690 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115631 4690 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.115646 4690 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.118478 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a728ab-e286-4606-b922-d510978b863a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.118620 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a728ab-e286-4606-b922-d510978b863a-kube-api-access-nmwk9" (OuterVolumeSpecName: "kube-api-access-nmwk9") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "kube-api-access-nmwk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.125218 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "01a728ab-e286-4606-b922-d510978b863a" (UID: "01a728ab-e286-4606-b922-d510978b863a"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.216829 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-kubelet\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.216889 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-node-log\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.216931 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-ovn\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.216964 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-systemd\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217001 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvw2v\" (UniqueName: \"kubernetes.io/projected/281396bf-6e05-4dd8-ab54-5058d65c9eb0-kube-api-access-nvw2v\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217042 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovnkube-config\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217152 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-run-ovn-kubernetes\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217299 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-log-socket\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217415 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-cni-bin\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217528 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-systemd-units\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217616 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217692 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217739 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovn-node-metrics-cert\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217792 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-var-lib-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217830 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-etc-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 
17:46:05.217922 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-slash\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.217969 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-env-overrides\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.218058 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-run-netns\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.218148 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-cni-netd\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.218225 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovnkube-script-lib\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.218431 4690 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.218509 4690 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/01a728ab-e286-4606-b922-d510978b863a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.218539 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmwk9\" (UniqueName: \"kubernetes.io/projected/01a728ab-e286-4606-b922-d510978b863a-kube-api-access-nmwk9\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.218603 4690 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/01a728ab-e286-4606-b922-d510978b863a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.218623 4690 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/01a728ab-e286-4606-b922-d510978b863a-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319367 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-var-lib-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319432 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-etc-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319456 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-slash\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319475 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-env-overrides\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319499 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-var-lib-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319561 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-run-netns\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319499 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-run-netns\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319619 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-cni-netd\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319646 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovnkube-script-lib\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319677 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-node-log\") pod \"ovnkube-node-qvkzb\" (UID: 
\"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319698 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-kubelet\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319696 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-slash\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319744 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-ovn\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319776 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-cni-netd\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319723 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-ovn\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319807 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-kubelet\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319850 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-systemd\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.319950 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvw2v\" (UniqueName: \"kubernetes.io/projected/281396bf-6e05-4dd8-ab54-5058d65c9eb0-kube-api-access-nvw2v\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320009 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-node-log\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320027 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovnkube-config\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320323 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-run-ovn-kubernetes\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320520 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-systemd\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320531 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-env-overrides\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320588 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-log-socket\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320607 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-run-ovn-kubernetes\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320589 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-etc-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320644 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-log-socket\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320662 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-cni-bin\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320677 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovnkube-script-lib\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320717 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-cni-bin\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320782 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-systemd-units\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320822 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320886 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.320930 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovn-node-metrics-cert\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.321194 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-run-openvswitch\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.321240 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-systemd-units\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.321250 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281396bf-6e05-4dd8-ab54-5058d65c9eb0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.321524 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovnkube-config\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.326210 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/281396bf-6e05-4dd8-ab54-5058d65c9eb0-ovn-node-metrics-cert\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.351537 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvw2v\" (UniqueName: \"kubernetes.io/projected/281396bf-6e05-4dd8-ab54-5058d65c9eb0-kube-api-access-nvw2v\") pod \"ovnkube-node-qvkzb\" (UID: \"281396bf-6e05-4dd8-ab54-5058d65c9eb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.402775 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.466320 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-s9shq" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.546614 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/2.log" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.548026 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/1.log" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.548121 4690 generic.go:334] "Generic (PLEG): container finished" podID="189715be-f690-4a1d-9bd3-fb0dcae7affe" containerID="50174b4b1d0d5ad19c52c1f42347f6d15551581b6ce597a9860c9607c408f9ff" exitCode=2 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.548231 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bf8dm" event={"ID":"189715be-f690-4a1d-9bd3-fb0dcae7affe","Type":"ContainerDied","Data":"50174b4b1d0d5ad19c52c1f42347f6d15551581b6ce597a9860c9607c408f9ff"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.548337 4690 scope.go:117] "RemoveContainer" containerID="1a2c238f16fbb8b532515c8ae6456c4e5b9b6e5797597ea258171e573c9f4ba7" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.549140 4690 scope.go:117] "RemoveContainer" containerID="50174b4b1d0d5ad19c52c1f42347f6d15551581b6ce597a9860c9607c408f9ff" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.552607 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.552619 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-qqzwn" event={"ID":"a6397759-9cf9-4996-9fc7-6ec98f00014a","Type":"ContainerDied","Data":"b5e829630273b1dacddd4d14e28f7ea40633b4e56691be78a775b4c24f47cd73"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.552659 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e829630273b1dacddd4d14e28f7ea40633b4e56691be78a775b4c24f47cd73" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.557727 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovnkube-controller/3.log" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.574026 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovn-acl-logging/0.log" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.576034 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7bsmm_01a728ab-e286-4606-b922-d510978b863a/ovn-controller/0.log" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577719 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec" exitCode=0 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577753 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff" exitCode=0 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577768 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c" exitCode=0 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577781 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f" exitCode=0 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577792 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098" exitCode=0 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577807 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852" exitCode=0 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577820 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901" exitCode=143 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577834 4690 generic.go:334] "Generic (PLEG): container finished" podID="01a728ab-e286-4606-b922-d510978b863a" containerID="95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4" exitCode=143 Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577905 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" 
event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577940 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577955 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577969 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577988 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578007 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578026 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578041 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578051 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578061 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.577906 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578069 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578881 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578896 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578904 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578922 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578931 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578945 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578963 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578974 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578981 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578988 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.578995 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579001 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579007 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579014 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579020 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579026 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579036 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579046 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579054 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579061 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579068 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579074 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579083 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579091 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579097 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579104 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579110 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579119 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7bsmm" event={"ID":"01a728ab-e286-4606-b922-d510978b863a","Type":"ContainerDied","Data":"cca249cb4b6b5151a2967ed0c06b0f8a24549915a836d9597d1d837c4b055a6e"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579130 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579137 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579144 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579152 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579159 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579165 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579172 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579181 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579189 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.579197 4690 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.600887 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"5996763a346cc52ed430078ac389b894b53ada138e8177a697f6f6f01b828b41"} Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.625963 4690 scope.go:117] "RemoveContainer" containerID="e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.636239 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29567140-xh2x5"] Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.641471 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-xh2x5"] Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.676269 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.709734 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7bsmm"] Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.717693 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7bsmm"] Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.725191 4690 scope.go:117] "RemoveContainer" containerID="6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.741008 4690 scope.go:117] "RemoveContainer" containerID="d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.757326 4690 scope.go:117] "RemoveContainer" containerID="78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.772807 4690 scope.go:117] "RemoveContainer" containerID="187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.822144 4690 scope.go:117] "RemoveContainer" containerID="89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.841525 4690 scope.go:117] "RemoveContainer" containerID="11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.857358 4690 scope.go:117] "RemoveContainer" containerID="95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.883668 4690 scope.go:117] "RemoveContainer" containerID="13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.896733 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a728ab-e286-4606-b922-d510978b863a" path="/var/lib/kubelet/pods/01a728ab-e286-4606-b922-d510978b863a/volumes" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.899929 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be701514-9b72-4af7-8a67-bbf545296477" path="/var/lib/kubelet/pods/be701514-9b72-4af7-8a67-bbf545296477/volumes" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.907784 4690 scope.go:117] "RemoveContainer" containerID="e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.908746 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": container with ID starting with e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec not found: ID does not exist" containerID="e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.908813 4690 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} err="failed to get container status \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": rpc error: code = NotFound desc = could not find container \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": container with ID starting with e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.908858 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.909224 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": container with ID starting with 81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca not found: ID does not exist" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.910135 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} err="failed to get container status \"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": rpc error: code = NotFound desc = could not find container \"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": container with ID starting with 81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.910175 4690 scope.go:117] "RemoveContainer" containerID="6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.910584 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": container with ID starting with 6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff not found: ID does not exist" containerID="6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.910680 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} err="failed to get container status \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": rpc error: code = NotFound desc = could not find container \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": container with ID starting with 6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.910707 4690 scope.go:117] "RemoveContainer" containerID="d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.911171 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": container with ID starting with d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c not found: ID does not exist" 
containerID="d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.911198 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} err="failed to get container status \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": rpc error: code = NotFound desc = could not find container \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": container with ID starting with d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.911216 4690 scope.go:117] "RemoveContainer" containerID="78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.911498 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": container with ID starting with 78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f not found: ID does not exist" containerID="78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.911521 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} err="failed to get container status \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": rpc error: code = NotFound desc = could not find container \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": container with ID starting with 78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.911534 4690 scope.go:117] "RemoveContainer" containerID="187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.911870 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": container with ID starting with 187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098 not found: ID does not exist" containerID="187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.911895 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} err="failed to get container status \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": rpc error: code = NotFound desc = could not find container \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": container with ID starting with 187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.911911 4690 scope.go:117] "RemoveContainer" containerID="89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.912475 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": container with ID starting with 89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852 not found: ID does not exist" containerID="89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.912557 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} err="failed to get container status \"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": rpc error: code = NotFound desc = could not find container \"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": container with ID starting with 89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.912610 4690 scope.go:117] "RemoveContainer" containerID="11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.913028 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": container with ID starting with 11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901 not found: ID does not exist" containerID="11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.913078 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} err="failed to get container status \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": rpc error: code = NotFound desc = could not find container \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": container with ID starting with 11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.913110 4690 scope.go:117] "RemoveContainer" containerID="95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4" Mar 20 17:46:05 crc kubenswrapper[4690]: E0320 17:46:05.913469 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": container with ID starting with 95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4 not found: ID does not exist" containerID="95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.913489 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} err="failed to get container status \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": rpc error: code = NotFound desc = could not find container \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": container with ID starting with 95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.913506 4690 scope.go:117] "RemoveContainer" containerID="13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3" Mar 20 17:46:05 crc 
kubenswrapper[4690]: E0320 17:46:05.913886 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": container with ID starting with 13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3 not found: ID does not exist" containerID="13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.913912 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} err="failed to get container status \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": rpc error: code = NotFound desc = could not find container \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": container with ID starting with 13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.913953 4690 scope.go:117] "RemoveContainer" containerID="e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.914281 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} err="failed to get container status \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": rpc error: code = NotFound desc = could not find container \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": container with ID starting with e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.914304 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.914642 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} err="failed to get container status \"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": rpc error: code = NotFound desc = could not find container \"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": container with ID starting with 81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.914661 4690 scope.go:117] "RemoveContainer" containerID="6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.915057 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} err="failed to get container status \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": rpc error: code = NotFound desc = could not find container \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": container with ID starting with 6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.915106 4690 scope.go:117] "RemoveContainer" containerID="d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c" Mar 20 17:46:05 crc 
kubenswrapper[4690]: I0320 17:46:05.915630 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} err="failed to get container status \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": rpc error: code = NotFound desc = could not find container \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": container with ID starting with d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.915682 4690 scope.go:117] "RemoveContainer" containerID="78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.916049 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} err="failed to get container status \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": rpc error: code = NotFound desc = could not find container \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": container with ID starting with 78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.916099 4690 scope.go:117] "RemoveContainer" containerID="187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.916452 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} err="failed to get container status \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": rpc error: code = NotFound desc = could not find container \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": container with ID starting with 187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.916483 4690 scope.go:117] "RemoveContainer" containerID="89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.916760 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} err="failed to get container status \"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": rpc error: code = NotFound desc = could not find container \"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": container with ID starting with 89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.916783 4690 scope.go:117] "RemoveContainer" containerID="11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.917083 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} err="failed to get container status \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": rpc error: code = NotFound desc = could not find container \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": container with ID 
starting with 11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.917106 4690 scope.go:117] "RemoveContainer" containerID="95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.917433 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} err="failed to get container status \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": rpc error: code = NotFound desc = could not find container \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": container with ID starting with 95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.917475 4690 scope.go:117] "RemoveContainer" containerID="13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.917801 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} err="failed to get container status \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": rpc error: code = NotFound desc = could not find container \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": container with ID starting with 13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.917826 4690 scope.go:117] "RemoveContainer" containerID="e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.918137 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} err="failed to get container status \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": rpc error: code = NotFound desc = could not find container \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": container with ID starting with e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.918159 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.918481 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} err="failed to get container status \"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": rpc error: code = NotFound desc = could not find container \"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": container with ID starting with 81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.918525 4690 scope.go:117] "RemoveContainer" containerID="6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.918851 4690 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} err="failed to get container status \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": rpc error: code = NotFound desc = could not find container \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": container with ID starting with 6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.918879 4690 scope.go:117] "RemoveContainer" containerID="d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.919107 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} err="failed to get container status \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": rpc error: code = NotFound desc = could not find container \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": container with ID starting with d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.919123 4690 scope.go:117] "RemoveContainer" containerID="78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.919799 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} err="failed to get container status \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": rpc error: code = NotFound desc = could not find container \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": container with ID starting with 78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.919825 4690 scope.go:117] "RemoveContainer" containerID="187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.920198 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} err="failed to get container status \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": rpc error: code = NotFound desc = could not find container \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": container with ID starting with 187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.920229 4690 scope.go:117] "RemoveContainer" containerID="89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.920491 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} err="failed to get container status \"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": rpc error: code = NotFound desc = could not find container \"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": container with ID starting with 89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852 not found: ID does not exist" Mar 
20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.920507 4690 scope.go:117] "RemoveContainer" containerID="11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.920829 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} err="failed to get container status \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": rpc error: code = NotFound desc = could not find container \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": container with ID starting with 11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.920855 4690 scope.go:117] "RemoveContainer" containerID="95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.921313 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} err="failed to get container status \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": rpc error: code = NotFound desc = could not find container \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": container with ID starting with 95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.921360 4690 scope.go:117] "RemoveContainer" containerID="13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.921685 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} err="failed to get container status \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": rpc error: code = NotFound desc = could not find container \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": container with ID starting with 13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.921712 4690 scope.go:117] "RemoveContainer" containerID="e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.922051 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec"} err="failed to get container status \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": rpc error: code = NotFound desc = could not find container \"e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec\": container with ID starting with e6135f9b9f357be4756c24d0e74244d64acd2fdfe7868743556379250a02e5ec not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.922073 4690 scope.go:117] "RemoveContainer" containerID="81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.922337 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca"} err="failed to get container status 
\"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": rpc error: code = NotFound desc = could not find container \"81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca\": container with ID starting with 81abe4d654d381b11ab7ff28d592be23303e3f7934bb0c68d3f3c30316b491ca not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.922367 4690 scope.go:117] "RemoveContainer" containerID="6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.923354 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff"} err="failed to get container status \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": rpc error: code = NotFound desc = could not find container \"6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff\": container with ID starting with 6447a78cef9ba2045f7928077399b681d152b37755ec287ae1633a26a67711ff not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.923396 4690 scope.go:117] "RemoveContainer" containerID="d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.923819 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c"} err="failed to get container status \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": rpc error: code = NotFound desc = could not find container \"d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c\": container with ID starting with d198c0b94cfc2e9429a02ccb1bf444b3746c37cd3278cc5c41cccad3a92f3a7c not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.923838 4690 scope.go:117] "RemoveContainer" containerID="78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.924398 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f"} err="failed to get container status \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": rpc error: code = NotFound desc = could not find container \"78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f\": container with ID starting with 78b79e7c6bc179739a43168addace3ea75f4067c5938f219a5cb0e545f65472f not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.924444 4690 scope.go:117] "RemoveContainer" containerID="187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.924787 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098"} err="failed to get container status \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": rpc error: code = NotFound desc = could not find container \"187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098\": container with ID starting with 187278dddcc4ae295ce37bb5966dd95b70987cf9579d8a302c45162906caa098 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.924818 4690 scope.go:117] "RemoveContainer" 
containerID="89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.925662 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852"} err="failed to get container status \"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": rpc error: code = NotFound desc = could not find container \"89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852\": container with ID starting with 89f5bb035f84384df58eb38689bda300611344d78c38c548c61cd02a479b6852 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.925688 4690 scope.go:117] "RemoveContainer" containerID="11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.926039 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901"} err="failed to get container status \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": rpc error: code = NotFound desc = could not find container \"11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901\": container with ID starting with 11c8e8059826df28ea1bdafe3ca56a8a902ff916246367be3ece76d468194901 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.926062 4690 scope.go:117] "RemoveContainer" containerID="95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.926321 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4"} err="failed to get container status \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": rpc error: code = NotFound desc = could not find container \"95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4\": container with ID starting with 95c9b322e5da6bc8172886af77d6507bccaaf8e4489181c78d3f5e522d781aa4 not found: ID does not exist" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.926339 4690 scope.go:117] "RemoveContainer" containerID="13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3" Mar 20 17:46:05 crc kubenswrapper[4690]: I0320 17:46:05.926530 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3"} err="failed to get container status \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": rpc error: code = NotFound desc = could not find container \"13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3\": container with ID starting with 13ad2529bd38d1e0c84ca456ccdcc8020ce82a667c5aa5ea3a0027d397ec94f3 not found: ID does not exist" Mar 20 17:46:06 crc kubenswrapper[4690]: I0320 17:46:06.612516 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bf8dm_189715be-f690-4a1d-9bd3-fb0dcae7affe/kube-multus/2.log" Mar 20 17:46:06 crc kubenswrapper[4690]: I0320 17:46:06.612663 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bf8dm" event={"ID":"189715be-f690-4a1d-9bd3-fb0dcae7affe","Type":"ContainerStarted","Data":"1fe69a40d153f2097e6c2bb5b660e5b1b6d5550e75afdad6572cca624d6207c0"} Mar 20 17:46:06 crc kubenswrapper[4690]: I0320 
17:46:06.618041 4690 generic.go:334] "Generic (PLEG): container finished" podID="281396bf-6e05-4dd8-ab54-5058d65c9eb0" containerID="059dfc6abfb42c0a375d4d739998a26e020c219c18244ba58e6fee3523e0cf31" exitCode=0 Mar 20 17:46:06 crc kubenswrapper[4690]: I0320 17:46:06.618097 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerDied","Data":"059dfc6abfb42c0a375d4d739998a26e020c219c18244ba58e6fee3523e0cf31"} Mar 20 17:46:07 crc kubenswrapper[4690]: I0320 17:46:07.219566 4690 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 17:46:07 crc kubenswrapper[4690]: I0320 17:46:07.368477 4690 scope.go:117] "RemoveContainer" containerID="40addebe99a4361632e57afb962d6551a5565f475c212055c71d2e0db97b2bce" Mar 20 17:46:07 crc kubenswrapper[4690]: I0320 17:46:07.625745 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"36adc20e1aad8020e62181ce6c6eb6d9d8bda51280004a0f4cafa5705a103722"} Mar 20 17:46:07 crc kubenswrapper[4690]: I0320 17:46:07.626018 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"0854c01487e74de71f71fddbf9a27942777343afd54395f514360291c81489f6"} Mar 20 17:46:07 crc kubenswrapper[4690]: I0320 17:46:07.626032 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"9819434fe62bdfde4bfd0e8837df15564b5b42d33ea9c8a3f15f4b7ef3a304a3"} Mar 20 17:46:07 crc kubenswrapper[4690]: I0320 17:46:07.626044 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"f11398a51d6b1d265b3cbddf93f776d110eb8baa7074f995c6a0af9cff61596d"} Mar 20 17:46:07 crc kubenswrapper[4690]: I0320 17:46:07.626055 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"d0a965bf5db262091adab497cb48b61ff6fab7831cdc4181ed927b72b3f6f721"} Mar 20 17:46:07 crc kubenswrapper[4690]: I0320 17:46:07.626085 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"86bff42f877f4cfb8ca6aedae58cb67b760a144dbc65f6fe147b8ee557aeb163"} Mar 20 17:46:10 crc kubenswrapper[4690]: I0320 17:46:10.655367 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"159f77efee0eaede248a3d97214f5f0fb6a4979ee123ad9e905c669dbfce59c1"} Mar 20 17:46:12 crc kubenswrapper[4690]: I0320 17:46:12.670791 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" event={"ID":"281396bf-6e05-4dd8-ab54-5058d65c9eb0","Type":"ContainerStarted","Data":"4dcb9b470aa98470423f3b46a0c93dff4a93f80ca31434c56c6caa8bb9124254"} Mar 20 17:46:12 crc kubenswrapper[4690]: I0320 17:46:12.671245 4690 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:12 crc kubenswrapper[4690]: I0320 17:46:12.671275 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:12 crc kubenswrapper[4690]: I0320 17:46:12.699787 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" podStartSLOduration=7.6997698329999995 podStartE2EDuration="7.699769833s" podCreationTimestamp="2026-03-20 17:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:46:12.693773344 +0000 UTC m=+847.559599042" watchObservedRunningTime="2026-03-20 17:46:12.699769833 +0000 UTC m=+847.565595511" Mar 20 17:46:12 crc kubenswrapper[4690]: I0320 17:46:12.704037 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:13 crc kubenswrapper[4690]: I0320 17:46:13.678819 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:13 crc kubenswrapper[4690]: I0320 17:46:13.725066 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:35 crc kubenswrapper[4690]: I0320 17:46:35.437204 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvkzb" Mar 20 17:46:42 crc kubenswrapper[4690]: I0320 17:46:42.875617 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd"] Mar 20 17:46:42 crc kubenswrapper[4690]: I0320 17:46:42.878345 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:42 crc kubenswrapper[4690]: I0320 17:46:42.879713 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 17:46:42 crc kubenswrapper[4690]: I0320 17:46:42.886926 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd"] Mar 20 17:46:42 crc kubenswrapper[4690]: I0320 17:46:42.948661 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5xmq\" (UniqueName: \"kubernetes.io/projected/417924bb-8f83-4db1-b370-92e0fac118f4-kube-api-access-l5xmq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:42 crc kubenswrapper[4690]: I0320 17:46:42.949362 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:42 crc kubenswrapper[4690]: I0320 17:46:42.949556 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.051073 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.051190 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.051399 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5xmq\" (UniqueName: \"kubernetes.io/projected/417924bb-8f83-4db1-b370-92e0fac118f4-kube-api-access-l5xmq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.052116 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.052144 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.086966 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5xmq\" (UniqueName: \"kubernetes.io/projected/417924bb-8f83-4db1-b370-92e0fac118f4-kube-api-access-l5xmq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.194463 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.678315 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd"] Mar 20 17:46:43 crc kubenswrapper[4690]: W0320 17:46:43.693615 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod417924bb_8f83_4db1_b370_92e0fac118f4.slice/crio-d3ba8b53a32ea8a40aa14fbcd087d171cffd709e0cec73a8d8e332a5c04864bc WatchSource:0}: Error finding container d3ba8b53a32ea8a40aa14fbcd087d171cffd709e0cec73a8d8e332a5c04864bc: Status 404 returned error can't find the container with id d3ba8b53a32ea8a40aa14fbcd087d171cffd709e0cec73a8d8e332a5c04864bc Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.895223 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" event={"ID":"417924bb-8f83-4db1-b370-92e0fac118f4","Type":"ContainerStarted","Data":"679b25303d45db2d26f6a68d0e96b299aa81a54433c0f47eaa0d9cdfed168ca2"} Mar 20 17:46:43 crc kubenswrapper[4690]: I0320 17:46:43.895304 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" event={"ID":"417924bb-8f83-4db1-b370-92e0fac118f4","Type":"ContainerStarted","Data":"d3ba8b53a32ea8a40aa14fbcd087d171cffd709e0cec73a8d8e332a5c04864bc"} Mar 20 17:46:44 crc kubenswrapper[4690]: I0320 17:46:44.904574 4690 generic.go:334] "Generic (PLEG): container finished" podID="417924bb-8f83-4db1-b370-92e0fac118f4" containerID="679b25303d45db2d26f6a68d0e96b299aa81a54433c0f47eaa0d9cdfed168ca2" exitCode=0 Mar 20 17:46:44 crc kubenswrapper[4690]: I0320 17:46:44.904704 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" event={"ID":"417924bb-8f83-4db1-b370-92e0fac118f4","Type":"ContainerDied","Data":"679b25303d45db2d26f6a68d0e96b299aa81a54433c0f47eaa0d9cdfed168ca2"} Mar 20 17:46:45 crc 
kubenswrapper[4690]: I0320 17:46:45.041430 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kjd6h"] Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.042415 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.066028 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjd6h"] Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.077856 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbdfl\" (UniqueName: \"kubernetes.io/projected/15f6141c-d70c-4c93-90ad-ff803278dc41-kube-api-access-wbdfl\") pod \"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.077956 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-catalog-content\") pod \"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.078003 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-utilities\") pod \"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.179348 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbdfl\" (UniqueName: \"kubernetes.io/projected/15f6141c-d70c-4c93-90ad-ff803278dc41-kube-api-access-wbdfl\") pod \"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.179442 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-catalog-content\") pod \"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.179494 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-utilities\") pod \"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.180021 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-catalog-content\") pod \"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.180088 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-utilities\") pod 
\"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.217066 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbdfl\" (UniqueName: \"kubernetes.io/projected/15f6141c-d70c-4c93-90ad-ff803278dc41-kube-api-access-wbdfl\") pod \"redhat-operators-kjd6h\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.406895 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.596829 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjd6h"] Mar 20 17:46:45 crc kubenswrapper[4690]: W0320 17:46:45.604693 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15f6141c_d70c_4c93_90ad_ff803278dc41.slice/crio-5b94116cfeb217800d954e1c13948b965df1e876c0bf369aa0ad6ad10e23c281 WatchSource:0}: Error finding container 5b94116cfeb217800d954e1c13948b965df1e876c0bf369aa0ad6ad10e23c281: Status 404 returned error can't find the container with id 5b94116cfeb217800d954e1c13948b965df1e876c0bf369aa0ad6ad10e23c281 Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.914750 4690 generic.go:334] "Generic (PLEG): container finished" podID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerID="c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b" exitCode=0 Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.914810 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjd6h" event={"ID":"15f6141c-d70c-4c93-90ad-ff803278dc41","Type":"ContainerDied","Data":"c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b"} Mar 20 17:46:45 crc kubenswrapper[4690]: I0320 17:46:45.914851 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjd6h" event={"ID":"15f6141c-d70c-4c93-90ad-ff803278dc41","Type":"ContainerStarted","Data":"5b94116cfeb217800d954e1c13948b965df1e876c0bf369aa0ad6ad10e23c281"} Mar 20 17:46:46 crc kubenswrapper[4690]: I0320 17:46:46.925111 4690 generic.go:334] "Generic (PLEG): container finished" podID="417924bb-8f83-4db1-b370-92e0fac118f4" containerID="ead39ab5b05782574664dbc889ac390b4e018e2a005aed47e5211b350eab8c93" exitCode=0 Mar 20 17:46:46 crc kubenswrapper[4690]: I0320 17:46:46.925230 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" event={"ID":"417924bb-8f83-4db1-b370-92e0fac118f4","Type":"ContainerDied","Data":"ead39ab5b05782574664dbc889ac390b4e018e2a005aed47e5211b350eab8c93"} Mar 20 17:46:47 crc kubenswrapper[4690]: I0320 17:46:47.934236 4690 generic.go:334] "Generic (PLEG): container finished" podID="417924bb-8f83-4db1-b370-92e0fac118f4" containerID="c01f68484e37f5387274636dd071f023b1714e13216888e425dbd7d5c5d4e374" exitCode=0 Mar 20 17:46:47 crc kubenswrapper[4690]: I0320 17:46:47.934332 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" event={"ID":"417924bb-8f83-4db1-b370-92e0fac118f4","Type":"ContainerDied","Data":"c01f68484e37f5387274636dd071f023b1714e13216888e425dbd7d5c5d4e374"} Mar 20 
17:46:47 crc kubenswrapper[4690]: I0320 17:46:47.937567 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjd6h" event={"ID":"15f6141c-d70c-4c93-90ad-ff803278dc41","Type":"ContainerStarted","Data":"b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f"} Mar 20 17:46:48 crc kubenswrapper[4690]: I0320 17:46:48.947544 4690 generic.go:334] "Generic (PLEG): container finished" podID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerID="b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f" exitCode=0 Mar 20 17:46:48 crc kubenswrapper[4690]: I0320 17:46:48.947630 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjd6h" event={"ID":"15f6141c-d70c-4c93-90ad-ff803278dc41","Type":"ContainerDied","Data":"b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f"} Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.334174 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.434522 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-bundle\") pod \"417924bb-8f83-4db1-b370-92e0fac118f4\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.434597 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5xmq\" (UniqueName: \"kubernetes.io/projected/417924bb-8f83-4db1-b370-92e0fac118f4-kube-api-access-l5xmq\") pod \"417924bb-8f83-4db1-b370-92e0fac118f4\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.434700 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-util\") pod \"417924bb-8f83-4db1-b370-92e0fac118f4\" (UID: \"417924bb-8f83-4db1-b370-92e0fac118f4\") " Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.435283 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-bundle" (OuterVolumeSpecName: "bundle") pod "417924bb-8f83-4db1-b370-92e0fac118f4" (UID: "417924bb-8f83-4db1-b370-92e0fac118f4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.442838 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417924bb-8f83-4db1-b370-92e0fac118f4-kube-api-access-l5xmq" (OuterVolumeSpecName: "kube-api-access-l5xmq") pod "417924bb-8f83-4db1-b370-92e0fac118f4" (UID: "417924bb-8f83-4db1-b370-92e0fac118f4"). InnerVolumeSpecName "kube-api-access-l5xmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.536791 4690 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.536842 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5xmq\" (UniqueName: \"kubernetes.io/projected/417924bb-8f83-4db1-b370-92e0fac118f4-kube-api-access-l5xmq\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.611155 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-util" (OuterVolumeSpecName: "util") pod "417924bb-8f83-4db1-b370-92e0fac118f4" (UID: "417924bb-8f83-4db1-b370-92e0fac118f4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.638284 4690 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/417924bb-8f83-4db1-b370-92e0fac118f4-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.957629 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.957600 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd" event={"ID":"417924bb-8f83-4db1-b370-92e0fac118f4","Type":"ContainerDied","Data":"d3ba8b53a32ea8a40aa14fbcd087d171cffd709e0cec73a8d8e332a5c04864bc"} Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.957811 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3ba8b53a32ea8a40aa14fbcd087d171cffd709e0cec73a8d8e332a5c04864bc" Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.960286 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjd6h" event={"ID":"15f6141c-d70c-4c93-90ad-ff803278dc41","Type":"ContainerStarted","Data":"a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d"} Mar 20 17:46:49 crc kubenswrapper[4690]: I0320 17:46:49.988910 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kjd6h" podStartSLOduration=1.513512902 podStartE2EDuration="4.988890849s" podCreationTimestamp="2026-03-20 17:46:45 +0000 UTC" firstStartedPulling="2026-03-20 17:46:45.92091536 +0000 UTC m=+880.786741048" lastFinishedPulling="2026-03-20 17:46:49.396293317 +0000 UTC m=+884.262118995" observedRunningTime="2026-03-20 17:46:49.985314478 +0000 UTC m=+884.851140166" watchObservedRunningTime="2026-03-20 17:46:49.988890849 +0000 UTC m=+884.854716527" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.427937 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6th4x"] Mar 20 17:46:53 crc kubenswrapper[4690]: E0320 17:46:53.428476 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417924bb-8f83-4db1-b370-92e0fac118f4" containerName="util" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.428494 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="417924bb-8f83-4db1-b370-92e0fac118f4" 
containerName="util" Mar 20 17:46:53 crc kubenswrapper[4690]: E0320 17:46:53.428503 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417924bb-8f83-4db1-b370-92e0fac118f4" containerName="pull" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.428510 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="417924bb-8f83-4db1-b370-92e0fac118f4" containerName="pull" Mar 20 17:46:53 crc kubenswrapper[4690]: E0320 17:46:53.428529 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417924bb-8f83-4db1-b370-92e0fac118f4" containerName="extract" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.428537 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="417924bb-8f83-4db1-b370-92e0fac118f4" containerName="extract" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.428671 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="417924bb-8f83-4db1-b370-92e0fac118f4" containerName="extract" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.429080 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6th4x" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.431488 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.432214 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.441869 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-nv7pm" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.454687 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6th4x"] Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.490174 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8v9c\" (UniqueName: \"kubernetes.io/projected/7e787826-7e6f-4ac9-856e-73304533640d-kube-api-access-f8v9c\") pod \"nmstate-operator-796d4cfff4-6th4x\" (UID: \"7e787826-7e6f-4ac9-856e-73304533640d\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6th4x" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.592090 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8v9c\" (UniqueName: \"kubernetes.io/projected/7e787826-7e6f-4ac9-856e-73304533640d-kube-api-access-f8v9c\") pod \"nmstate-operator-796d4cfff4-6th4x\" (UID: \"7e787826-7e6f-4ac9-856e-73304533640d\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6th4x" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.619232 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8v9c\" (UniqueName: \"kubernetes.io/projected/7e787826-7e6f-4ac9-856e-73304533640d-kube-api-access-f8v9c\") pod \"nmstate-operator-796d4cfff4-6th4x\" (UID: \"7e787826-7e6f-4ac9-856e-73304533640d\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-6th4x" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.756184 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6th4x" Mar 20 17:46:53 crc kubenswrapper[4690]: I0320 17:46:53.985065 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-6th4x"] Mar 20 17:46:54 crc kubenswrapper[4690]: I0320 17:46:54.990524 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6th4x" event={"ID":"7e787826-7e6f-4ac9-856e-73304533640d","Type":"ContainerStarted","Data":"d2572d658c2451e5f26ec02652b0ae29c58ddf7ec7e21d3689b9e60636d88ec7"} Mar 20 17:46:55 crc kubenswrapper[4690]: I0320 17:46:55.407552 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:55 crc kubenswrapper[4690]: I0320 17:46:55.407724 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:46:56 crc kubenswrapper[4690]: I0320 17:46:56.470593 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kjd6h" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="registry-server" probeResult="failure" output=< Mar 20 17:46:56 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 17:46:56 crc kubenswrapper[4690]: > Mar 20 17:46:57 crc kubenswrapper[4690]: I0320 17:46:57.003679 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6th4x" event={"ID":"7e787826-7e6f-4ac9-856e-73304533640d","Type":"ContainerStarted","Data":"e850556acae871ad62301ee75d58f050bfb4be832980e17b6703dd4e55754210"} Mar 20 17:46:57 crc kubenswrapper[4690]: I0320 17:46:57.031290 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-6th4x" podStartSLOduration=1.4790325100000001 podStartE2EDuration="4.031241843s" podCreationTimestamp="2026-03-20 17:46:53 +0000 UTC" firstStartedPulling="2026-03-20 17:46:53.99189585 +0000 UTC m=+888.857721528" lastFinishedPulling="2026-03-20 17:46:56.544105183 +0000 UTC m=+891.409930861" observedRunningTime="2026-03-20 17:46:57.029627717 +0000 UTC m=+891.895453405" watchObservedRunningTime="2026-03-20 17:46:57.031241843 +0000 UTC m=+891.897067541" Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.917957 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn"] Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.919788 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.925760 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nxsxm" Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.935625 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2gftm"] Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.936685 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.939147 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.956375 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-h7mj5"] Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.957687 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:02 crc kubenswrapper[4690]: I0320 17:47:02.959915 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2gftm"] Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.016724 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn"] Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.017729 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz7tb\" (UniqueName: \"kubernetes.io/projected/943c74c5-b182-46da-9ea4-164a4eb553d0-kube-api-access-sz7tb\") pod \"nmstate-webhook-5f558f5558-2gftm\" (UID: \"943c74c5-b182-46da-9ea4-164a4eb553d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.017773 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/943c74c5-b182-46da-9ea4-164a4eb553d0-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2gftm\" (UID: \"943c74c5-b182-46da-9ea4-164a4eb553d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.017806 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-ovs-socket\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.017829 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgz9\" (UniqueName: \"kubernetes.io/projected/2a8c0e04-bbfb-46b1-8562-2a1697b85035-kube-api-access-6xgz9\") pod \"nmstate-metrics-9b8c8685d-zpndn\" (UID: \"2a8c0e04-bbfb-46b1-8562-2a1697b85035\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.017857 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-dbus-socket\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.017876 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-nmstate-lock\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.017938 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mglgk\" (UniqueName: \"kubernetes.io/projected/15577f5b-3df3-4e8e-bebe-6abe5379debf-kube-api-access-mglgk\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.067219 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl"] Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.067916 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.071310 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.071346 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.071329 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9jcgl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.080013 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl"] Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119611 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgz9\" (UniqueName: \"kubernetes.io/projected/2a8c0e04-bbfb-46b1-8562-2a1697b85035-kube-api-access-6xgz9\") pod \"nmstate-metrics-9b8c8685d-zpndn\" (UID: \"2a8c0e04-bbfb-46b1-8562-2a1697b85035\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119665 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-dbus-socket\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119687 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-nmstate-lock\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119729 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bab1f41a-57aa-43a8-b690-62eb634c99dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119756 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mglgk\" (UniqueName: \"kubernetes.io/projected/15577f5b-3df3-4e8e-bebe-6abe5379debf-kube-api-access-mglgk\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119788 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bab1f41a-57aa-43a8-b690-62eb634c99dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119804 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqz4m\" (UniqueName: \"kubernetes.io/projected/bab1f41a-57aa-43a8-b690-62eb634c99dc-kube-api-access-wqz4m\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119831 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz7tb\" (UniqueName: \"kubernetes.io/projected/943c74c5-b182-46da-9ea4-164a4eb553d0-kube-api-access-sz7tb\") pod \"nmstate-webhook-5f558f5558-2gftm\" (UID: \"943c74c5-b182-46da-9ea4-164a4eb553d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119847 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/943c74c5-b182-46da-9ea4-164a4eb553d0-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2gftm\" (UID: \"943c74c5-b182-46da-9ea4-164a4eb553d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119870 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-ovs-socket\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.119927 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-ovs-socket\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.120541 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-nmstate-lock\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: E0320 17:47:03.120614 4690 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 17:47:03 crc kubenswrapper[4690]: E0320 17:47:03.120692 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/943c74c5-b182-46da-9ea4-164a4eb553d0-tls-key-pair podName:943c74c5-b182-46da-9ea4-164a4eb553d0 nodeName:}" failed. No retries permitted until 2026-03-20 17:47:03.620671971 +0000 UTC m=+898.486497659 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/943c74c5-b182-46da-9ea4-164a4eb553d0-tls-key-pair") pod "nmstate-webhook-5f558f5558-2gftm" (UID: "943c74c5-b182-46da-9ea4-164a4eb553d0") : secret "openshift-nmstate-webhook" not found Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.120721 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/15577f5b-3df3-4e8e-bebe-6abe5379debf-dbus-socket\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.139893 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz7tb\" (UniqueName: \"kubernetes.io/projected/943c74c5-b182-46da-9ea4-164a4eb553d0-kube-api-access-sz7tb\") pod \"nmstate-webhook-5f558f5558-2gftm\" (UID: \"943c74c5-b182-46da-9ea4-164a4eb553d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.150859 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mglgk\" (UniqueName: \"kubernetes.io/projected/15577f5b-3df3-4e8e-bebe-6abe5379debf-kube-api-access-mglgk\") pod \"nmstate-handler-h7mj5\" (UID: \"15577f5b-3df3-4e8e-bebe-6abe5379debf\") " pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.159310 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgz9\" (UniqueName: \"kubernetes.io/projected/2a8c0e04-bbfb-46b1-8562-2a1697b85035-kube-api-access-6xgz9\") pod \"nmstate-metrics-9b8c8685d-zpndn\" (UID: \"2a8c0e04-bbfb-46b1-8562-2a1697b85035\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.221470 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bab1f41a-57aa-43a8-b690-62eb634c99dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.221558 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bab1f41a-57aa-43a8-b690-62eb634c99dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.221592 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqz4m\" (UniqueName: \"kubernetes.io/projected/bab1f41a-57aa-43a8-b690-62eb634c99dc-kube-api-access-wqz4m\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.222396 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bab1f41a-57aa-43a8-b690-62eb634c99dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc 
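
The tls-key-pair mount initially fails because the openshift-nmstate-webhook secret does not exist yet; the kubelet schedules a retry ("No retries permitted until ... durationBeforeRetry 500ms") and the SetUp succeeds on the next reconciler pass at 17:47:03.63, once the secret has been created. Failed volume operations are retried with an increasing backoff rather than in a tight loop; the sketch below shows the general shape of such a retry using the apimachinery wait helpers, with parameters that are illustrative beyond the 500 ms starting delay seen in the log.

package main

import (
	"errors"
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

var errSecretNotFound = errors.New(`secret "openshift-nmstate-webhook" not found`)

// mountTLSKeyPair stands in for MountVolume.SetUp; it fails until the secret
// "appears" (simulated by the attempt counter), mirroring the log sequence.
func mountTLSKeyPair(attempt *int) error {
	*attempt++
	if *attempt < 2 {
		return errSecretNotFound
	}
	return nil
}

func main() {
	attempt := 0
	backoff := wait.Backoff{
		Duration: 500 * time.Millisecond, // initial delay, as in durationBeforeRetry 500ms
		Factor:   2.0,                    // growth factor assumed
		Steps:    5,                      // retry budget assumed
	}
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		if err := mountTLSKeyPair(&attempt); err != nil {
			fmt.Println("retrying after error:", err)
			return false, nil // not done yet; back off and retry
		}
		return true, nil
	})
	fmt.Println("mount result:", err, "after attempts:", attempt)
}
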
kubenswrapper[4690]: I0320 17:47:03.224798 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bab1f41a-57aa-43a8-b690-62eb634c99dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.241722 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.248328 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqz4m\" (UniqueName: \"kubernetes.io/projected/bab1f41a-57aa-43a8-b690-62eb634c99dc-kube-api-access-wqz4m\") pod \"nmstate-console-plugin-86f58fcf4-66twl\" (UID: \"bab1f41a-57aa-43a8-b690-62eb634c99dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.256996 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cc96b59bf-qn2zh"] Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.257777 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.277054 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.277084 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc96b59bf-qn2zh"] Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.305153 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.323299 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/13206aa0-3692-44c2-a381-d69bb042f883-console-serving-cert\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.323937 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-console-config\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.324076 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/13206aa0-3692-44c2-a381-d69bb042f883-console-oauth-config\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.324177 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-service-ca\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc 
kubenswrapper[4690]: I0320 17:47:03.324454 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6wkf\" (UniqueName: \"kubernetes.io/projected/13206aa0-3692-44c2-a381-d69bb042f883-kube-api-access-h6wkf\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.324597 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-oauth-serving-cert\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.324623 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-trusted-ca-bundle\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.383728 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.425745 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-trusted-ca-bundle\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.425810 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/13206aa0-3692-44c2-a381-d69bb042f883-console-serving-cert\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.425837 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-console-config\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.425874 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/13206aa0-3692-44c2-a381-d69bb042f883-console-oauth-config\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.425901 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-service-ca\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.427130 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h6wkf\" (UniqueName: \"kubernetes.io/projected/13206aa0-3692-44c2-a381-d69bb042f883-kube-api-access-h6wkf\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.427201 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-oauth-serving-cert\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.427466 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-console-config\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.427571 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-service-ca\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.427594 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-trusted-ca-bundle\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.427990 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/13206aa0-3692-44c2-a381-d69bb042f883-oauth-serving-cert\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.434745 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/13206aa0-3692-44c2-a381-d69bb042f883-console-serving-cert\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.434818 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/13206aa0-3692-44c2-a381-d69bb042f883-console-oauth-config\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.438186 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn"] Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.444034 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6wkf\" (UniqueName: \"kubernetes.io/projected/13206aa0-3692-44c2-a381-d69bb042f883-kube-api-access-h6wkf\") pod \"console-7cc96b59bf-qn2zh\" (UID: \"13206aa0-3692-44c2-a381-d69bb042f883\") " pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 
crc kubenswrapper[4690]: W0320 17:47:03.457218 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a8c0e04_bbfb_46b1_8562_2a1697b85035.slice/crio-9b3edf3088778386e7d72fd6b56a95ffd300f45246bb62c767a94f11aa7c0dfe WatchSource:0}: Error finding container 9b3edf3088778386e7d72fd6b56a95ffd300f45246bb62c767a94f11aa7c0dfe: Status 404 returned error can't find the container with id 9b3edf3088778386e7d72fd6b56a95ffd300f45246bb62c767a94f11aa7c0dfe Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.551042 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl"] Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.599182 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.630355 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/943c74c5-b182-46da-9ea4-164a4eb553d0-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2gftm\" (UID: \"943c74c5-b182-46da-9ea4-164a4eb553d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.633804 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/943c74c5-b182-46da-9ea4-164a4eb553d0-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2gftm\" (UID: \"943c74c5-b182-46da-9ea4-164a4eb553d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.829708 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc96b59bf-qn2zh"] Mar 20 17:47:03 crc kubenswrapper[4690]: W0320 17:47:03.836701 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13206aa0_3692_44c2_a381_d69bb042f883.slice/crio-4802a30dab7d893a8ba783caf6afc69ebe274d07f1146984e0f6f09cf308c9c1 WatchSource:0}: Error finding container 4802a30dab7d893a8ba783caf6afc69ebe274d07f1146984e0f6f09cf308c9c1: Status 404 returned error can't find the container with id 4802a30dab7d893a8ba783caf6afc69ebe274d07f1146984e0f6f09cf308c9c1 Mar 20 17:47:03 crc kubenswrapper[4690]: I0320 17:47:03.856074 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:04 crc kubenswrapper[4690]: I0320 17:47:04.051050 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" event={"ID":"bab1f41a-57aa-43a8-b690-62eb634c99dc","Type":"ContainerStarted","Data":"abe5d93f35766c9172d667ea4b724fe5caefddc33b92ad09a90f225d320abe7f"} Mar 20 17:47:04 crc kubenswrapper[4690]: I0320 17:47:04.052169 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" event={"ID":"2a8c0e04-bbfb-46b1-8562-2a1697b85035","Type":"ContainerStarted","Data":"9b3edf3088778386e7d72fd6b56a95ffd300f45246bb62c767a94f11aa7c0dfe"} Mar 20 17:47:04 crc kubenswrapper[4690]: I0320 17:47:04.053232 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h7mj5" event={"ID":"15577f5b-3df3-4e8e-bebe-6abe5379debf","Type":"ContainerStarted","Data":"d60266d2ce0e8fa0cb6d278d21ab6f06e7c80a8b5fc74b03c3b2716a46741d9c"} Mar 20 17:47:04 crc kubenswrapper[4690]: I0320 17:47:04.055185 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc96b59bf-qn2zh" event={"ID":"13206aa0-3692-44c2-a381-d69bb042f883","Type":"ContainerStarted","Data":"78c6df70834b4d4e0765fd712849cbd729f931d02a83430e07c124dcd1512286"} Mar 20 17:47:04 crc kubenswrapper[4690]: I0320 17:47:04.055468 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc96b59bf-qn2zh" event={"ID":"13206aa0-3692-44c2-a381-d69bb042f883","Type":"ContainerStarted","Data":"4802a30dab7d893a8ba783caf6afc69ebe274d07f1146984e0f6f09cf308c9c1"} Mar 20 17:47:04 crc kubenswrapper[4690]: I0320 17:47:04.074770 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cc96b59bf-qn2zh" podStartSLOduration=1.074744458 podStartE2EDuration="1.074744458s" podCreationTimestamp="2026-03-20 17:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:47:04.072156485 +0000 UTC m=+898.937982173" watchObservedRunningTime="2026-03-20 17:47:04.074744458 +0000 UTC m=+898.940570136" Mar 20 17:47:04 crc kubenswrapper[4690]: I0320 17:47:04.323431 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2gftm"] Mar 20 17:47:04 crc kubenswrapper[4690]: W0320 17:47:04.339291 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943c74c5_b182_46da_9ea4_164a4eb553d0.slice/crio-b728d7a11cd5fd986c7014ed7ba35c92503968551380f16d599d11a1a6af7a00 WatchSource:0}: Error finding container b728d7a11cd5fd986c7014ed7ba35c92503968551380f16d599d11a1a6af7a00: Status 404 returned error can't find the container with id b728d7a11cd5fd986c7014ed7ba35c92503968551380f16d599d11a1a6af7a00 Mar 20 17:47:05 crc kubenswrapper[4690]: I0320 17:47:05.061610 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" event={"ID":"943c74c5-b182-46da-9ea4-164a4eb553d0","Type":"ContainerStarted","Data":"b728d7a11cd5fd986c7014ed7ba35c92503968551380f16d599d11a1a6af7a00"} Mar 20 17:47:05 crc kubenswrapper[4690]: I0320 17:47:05.451773 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:47:05 crc kubenswrapper[4690]: I0320 
17:47:05.502001 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:47:05 crc kubenswrapper[4690]: I0320 17:47:05.686838 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjd6h"] Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.076325 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" event={"ID":"bab1f41a-57aa-43a8-b690-62eb634c99dc","Type":"ContainerStarted","Data":"3cb16ff729acacbd15741a7e48be1af682f1c928c03b84bcc22c5052e3c4676f"} Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.077750 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" event={"ID":"2a8c0e04-bbfb-46b1-8562-2a1697b85035","Type":"ContainerStarted","Data":"98709171d5e457c04c335af9cdbb6322a0974fe04719e5a4c5396f32b35309ba"} Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.079825 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" event={"ID":"943c74c5-b182-46da-9ea4-164a4eb553d0","Type":"ContainerStarted","Data":"8eda50b833c3d8e71a49ef08c7e2b9ba42e0aa39d72e249980d337e56a2d2b27"} Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.080016 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.082023 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kjd6h" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="registry-server" containerID="cri-o://a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d" gracePeriod=2 Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.082075 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h7mj5" event={"ID":"15577f5b-3df3-4e8e-bebe-6abe5379debf","Type":"ContainerStarted","Data":"30fa2ecc7fc5ecae6dcd9ca17bc1ad3a8dfa8944f93bef96862bc3cc01015c6b"} Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.082193 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.132608 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-66twl" podStartSLOduration=1.121969285 podStartE2EDuration="4.132577694s" podCreationTimestamp="2026-03-20 17:47:03 +0000 UTC" firstStartedPulling="2026-03-20 17:47:03.557129717 +0000 UTC m=+898.422955405" lastFinishedPulling="2026-03-20 17:47:06.567738116 +0000 UTC m=+901.433563814" observedRunningTime="2026-03-20 17:47:07.099038055 +0000 UTC m=+901.964863773" watchObservedRunningTime="2026-03-20 17:47:07.132577694 +0000 UTC m=+901.998403412" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.157664 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-h7mj5" podStartSLOduration=1.896682312 podStartE2EDuration="5.157643023s" podCreationTimestamp="2026-03-20 17:47:02 +0000 UTC" firstStartedPulling="2026-03-20 17:47:03.304930243 +0000 UTC m=+898.170755921" lastFinishedPulling="2026-03-20 17:47:06.565890954 +0000 UTC m=+901.431716632" observedRunningTime="2026-03-20 17:47:07.132043349 +0000 UTC m=+901.997869057" 
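
Once the redhat-operators-kjd6h pod is deleted through the API ("SyncLoop DELETE"), the kubelet stops the registry-server container with a two-second grace period: the runtime sends SIGTERM and escalates to SIGKILL if the container is still running when the period expires. The grace period normally comes from the pod spec's terminationGracePeriodSeconds, though a client may override it on the delete call; a small client-go sketch of such a delete follows, with the namespace and pod name taken from the log and the clientset construction assumed.

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the default location; error handling is minimal.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	grace := int64(2) // matches gracePeriod=2 in the log
	err = client.CoreV1().Pods("openshift-marketplace").Delete(
		context.TODO(),
		"redhat-operators-kjd6h",
		metav1.DeleteOptions{GracePeriodSeconds: &grace},
	)
	if err != nil {
		panic(err)
	}
}
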
watchObservedRunningTime="2026-03-20 17:47:07.157643023 +0000 UTC m=+902.023468701" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.158477 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" podStartSLOduration=2.929694552 podStartE2EDuration="5.158473726s" podCreationTimestamp="2026-03-20 17:47:02 +0000 UTC" firstStartedPulling="2026-03-20 17:47:04.340498766 +0000 UTC m=+899.206324454" lastFinishedPulling="2026-03-20 17:47:06.56927794 +0000 UTC m=+901.435103628" observedRunningTime="2026-03-20 17:47:07.156281844 +0000 UTC m=+902.022107522" watchObservedRunningTime="2026-03-20 17:47:07.158473726 +0000 UTC m=+902.024299404" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.473443 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.587650 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbdfl\" (UniqueName: \"kubernetes.io/projected/15f6141c-d70c-4c93-90ad-ff803278dc41-kube-api-access-wbdfl\") pod \"15f6141c-d70c-4c93-90ad-ff803278dc41\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.587728 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-catalog-content\") pod \"15f6141c-d70c-4c93-90ad-ff803278dc41\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.587767 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-utilities\") pod \"15f6141c-d70c-4c93-90ad-ff803278dc41\" (UID: \"15f6141c-d70c-4c93-90ad-ff803278dc41\") " Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.589453 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-utilities" (OuterVolumeSpecName: "utilities") pod "15f6141c-d70c-4c93-90ad-ff803278dc41" (UID: "15f6141c-d70c-4c93-90ad-ff803278dc41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.594622 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f6141c-d70c-4c93-90ad-ff803278dc41-kube-api-access-wbdfl" (OuterVolumeSpecName: "kube-api-access-wbdfl") pod "15f6141c-d70c-4c93-90ad-ff803278dc41" (UID: "15f6141c-d70c-4c93-90ad-ff803278dc41"). InnerVolumeSpecName "kube-api-access-wbdfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.688795 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.688828 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbdfl\" (UniqueName: \"kubernetes.io/projected/15f6141c-d70c-4c93-90ad-ff803278dc41-kube-api-access-wbdfl\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.708463 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15f6141c-d70c-4c93-90ad-ff803278dc41" (UID: "15f6141c-d70c-4c93-90ad-ff803278dc41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:47:07 crc kubenswrapper[4690]: I0320 17:47:07.790394 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f6141c-d70c-4c93-90ad-ff803278dc41-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.095650 4690 generic.go:334] "Generic (PLEG): container finished" podID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerID="a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d" exitCode=0 Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.097449 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjd6h" event={"ID":"15f6141c-d70c-4c93-90ad-ff803278dc41","Type":"ContainerDied","Data":"a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d"} Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.097523 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kjd6h" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.097563 4690 scope.go:117] "RemoveContainer" containerID="a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.097547 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjd6h" event={"ID":"15f6141c-d70c-4c93-90ad-ff803278dc41","Type":"ContainerDied","Data":"5b94116cfeb217800d954e1c13948b965df1e876c0bf369aa0ad6ad10e23c281"} Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.121569 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjd6h"] Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.126896 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kjd6h"] Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.130247 4690 scope.go:117] "RemoveContainer" containerID="b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.160995 4690 scope.go:117] "RemoveContainer" containerID="c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.188041 4690 scope.go:117] "RemoveContainer" containerID="a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d" Mar 20 17:47:08 crc kubenswrapper[4690]: E0320 17:47:08.188555 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d\": container with ID starting with a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d not found: ID does not exist" containerID="a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.188598 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d"} err="failed to get container status \"a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d\": rpc error: code = NotFound desc = could not find container \"a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d\": container with ID starting with a9f6442062ddcad51c20e399346df9a9e7c6cbdd59288b0eedec24d4b3b7807d not found: ID does not exist" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.188625 4690 scope.go:117] "RemoveContainer" containerID="b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f" Mar 20 17:47:08 crc kubenswrapper[4690]: E0320 17:47:08.189056 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f\": container with ID starting with b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f not found: ID does not exist" containerID="b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.189104 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f"} err="failed to get container status \"b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f\": rpc error: code = NotFound desc = could not find container 
\"b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f\": container with ID starting with b2ce3132325536d87c94d6c149b23e29c3f4537f7bcbdeb9d3f2e208a210db5f not found: ID does not exist" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.189136 4690 scope.go:117] "RemoveContainer" containerID="c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b" Mar 20 17:47:08 crc kubenswrapper[4690]: E0320 17:47:08.189459 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b\": container with ID starting with c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b not found: ID does not exist" containerID="c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b" Mar 20 17:47:08 crc kubenswrapper[4690]: I0320 17:47:08.189490 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b"} err="failed to get container status \"c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b\": rpc error: code = NotFound desc = could not find container \"c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b\": container with ID starting with c5ac24c7b5b842e842f9a13015e636e3dfc77aaa4ba4cbde8f6ac8619faf450b not found: ID does not exist" Mar 20 17:47:09 crc kubenswrapper[4690]: I0320 17:47:09.902592 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" path="/var/lib/kubelet/pods/15f6141c-d70c-4c93-90ad-ff803278dc41/volumes" Mar 20 17:47:10 crc kubenswrapper[4690]: I0320 17:47:10.109813 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" event={"ID":"2a8c0e04-bbfb-46b1-8562-2a1697b85035","Type":"ContainerStarted","Data":"d0a663e359cb4177ea72b01ac2a1bcd922d0861386df8e3843a8ddcb2940b37a"} Mar 20 17:47:10 crc kubenswrapper[4690]: I0320 17:47:10.128663 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zpndn" podStartSLOduration=2.222853888 podStartE2EDuration="8.128647175s" podCreationTimestamp="2026-03-20 17:47:02 +0000 UTC" firstStartedPulling="2026-03-20 17:47:03.461714728 +0000 UTC m=+898.327540406" lastFinishedPulling="2026-03-20 17:47:09.367508015 +0000 UTC m=+904.233333693" observedRunningTime="2026-03-20 17:47:10.125353572 +0000 UTC m=+904.991179270" watchObservedRunningTime="2026-03-20 17:47:10.128647175 +0000 UTC m=+904.994472853" Mar 20 17:47:13 crc kubenswrapper[4690]: I0320 17:47:13.322079 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-h7mj5" Mar 20 17:47:13 crc kubenswrapper[4690]: I0320 17:47:13.600373 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:13 crc kubenswrapper[4690]: I0320 17:47:13.601079 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:13 crc kubenswrapper[4690]: I0320 17:47:13.606346 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:14 crc kubenswrapper[4690]: I0320 17:47:14.148744 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-7cc96b59bf-qn2zh" Mar 20 17:47:14 crc kubenswrapper[4690]: I0320 17:47:14.226036 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ppgjz"] Mar 20 17:47:23 crc kubenswrapper[4690]: I0320 17:47:23.865706 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2gftm" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.130788 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj"] Mar 20 17:47:38 crc kubenswrapper[4690]: E0320 17:47:38.131924 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="registry-server" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.131953 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="registry-server" Mar 20 17:47:38 crc kubenswrapper[4690]: E0320 17:47:38.131975 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="extract-content" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.131993 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="extract-content" Mar 20 17:47:38 crc kubenswrapper[4690]: E0320 17:47:38.132023 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="extract-utilities" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.132042 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="extract-utilities" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.132768 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f6141c-d70c-4c93-90ad-ff803278dc41" containerName="registry-server" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.134571 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.138393 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.157511 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj"] Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.178807 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.178867 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.178984 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h99rg\" (UniqueName: \"kubernetes.io/projected/40861b19-ba1a-4adf-8ee2-25a7c3016940-kube-api-access-h99rg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.280140 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h99rg\" (UniqueName: \"kubernetes.io/projected/40861b19-ba1a-4adf-8ee2-25a7c3016940-kube-api-access-h99rg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.280208 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.280234 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.280767 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.281283 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.303670 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h99rg\" (UniqueName: \"kubernetes.io/projected/40861b19-ba1a-4adf-8ee2-25a7c3016940-kube-api-access-h99rg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.477415 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:38 crc kubenswrapper[4690]: I0320 17:47:38.990932 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj"] Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.292668 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ppgjz" podUID="c4eaf3f2-8536-46bf-8c5f-82606abec128" containerName="console" containerID="cri-o://afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2" gracePeriod=15 Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.338003 4690 generic.go:334] "Generic (PLEG): container finished" podID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerID="96ca6a297e47259805cf69ed43aaab7bc18752b1481943b00a20d120eb8901c6" exitCode=0 Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.338069 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" event={"ID":"40861b19-ba1a-4adf-8ee2-25a7c3016940","Type":"ContainerDied","Data":"96ca6a297e47259805cf69ed43aaab7bc18752b1481943b00a20d120eb8901c6"} Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.338109 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" event={"ID":"40861b19-ba1a-4adf-8ee2-25a7c3016940","Type":"ContainerStarted","Data":"9f961ccae126cff2ff4abed26969a25b6d3278ceadb96a3348d919d1e613e13e"} Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.677859 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ppgjz_c4eaf3f2-8536-46bf-8c5f-82606abec128/console/0.log" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.677924 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.820968 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-service-ca\") pod \"c4eaf3f2-8536-46bf-8c5f-82606abec128\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.821062 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-config\") pod \"c4eaf3f2-8536-46bf-8c5f-82606abec128\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.821173 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4c77\" (UniqueName: \"kubernetes.io/projected/c4eaf3f2-8536-46bf-8c5f-82606abec128-kube-api-access-w4c77\") pod \"c4eaf3f2-8536-46bf-8c5f-82606abec128\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.821238 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-trusted-ca-bundle\") pod \"c4eaf3f2-8536-46bf-8c5f-82606abec128\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.821365 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-oauth-serving-cert\") pod \"c4eaf3f2-8536-46bf-8c5f-82606abec128\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.821416 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-oauth-config\") pod \"c4eaf3f2-8536-46bf-8c5f-82606abec128\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.821467 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-serving-cert\") pod \"c4eaf3f2-8536-46bf-8c5f-82606abec128\" (UID: \"c4eaf3f2-8536-46bf-8c5f-82606abec128\") " Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.822399 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-config" (OuterVolumeSpecName: "console-config") pod "c4eaf3f2-8536-46bf-8c5f-82606abec128" (UID: "c4eaf3f2-8536-46bf-8c5f-82606abec128"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.822425 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-service-ca" (OuterVolumeSpecName: "service-ca") pod "c4eaf3f2-8536-46bf-8c5f-82606abec128" (UID: "c4eaf3f2-8536-46bf-8c5f-82606abec128"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.823051 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c4eaf3f2-8536-46bf-8c5f-82606abec128" (UID: "c4eaf3f2-8536-46bf-8c5f-82606abec128"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.823510 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c4eaf3f2-8536-46bf-8c5f-82606abec128" (UID: "c4eaf3f2-8536-46bf-8c5f-82606abec128"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.828874 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c4eaf3f2-8536-46bf-8c5f-82606abec128" (UID: "c4eaf3f2-8536-46bf-8c5f-82606abec128"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.828894 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4eaf3f2-8536-46bf-8c5f-82606abec128-kube-api-access-w4c77" (OuterVolumeSpecName: "kube-api-access-w4c77") pod "c4eaf3f2-8536-46bf-8c5f-82606abec128" (UID: "c4eaf3f2-8536-46bf-8c5f-82606abec128"). InnerVolumeSpecName "kube-api-access-w4c77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.837136 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c4eaf3f2-8536-46bf-8c5f-82606abec128" (UID: "c4eaf3f2-8536-46bf-8c5f-82606abec128"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.924192 4690 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.924233 4690 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.924279 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4c77\" (UniqueName: \"kubernetes.io/projected/c4eaf3f2-8536-46bf-8c5f-82606abec128-kube-api-access-w4c77\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.924298 4690 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.924318 4690 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.924334 4690 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4eaf3f2-8536-46bf-8c5f-82606abec128-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4690]: I0320 17:47:39.924349 4690 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4eaf3f2-8536-46bf-8c5f-82606abec128-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.349188 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ppgjz_c4eaf3f2-8536-46bf-8c5f-82606abec128/console/0.log" Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.349352 4690 generic.go:334] "Generic (PLEG): container finished" podID="c4eaf3f2-8536-46bf-8c5f-82606abec128" containerID="afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2" exitCode=2 Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.349441 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ppgjz" event={"ID":"c4eaf3f2-8536-46bf-8c5f-82606abec128","Type":"ContainerDied","Data":"afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2"} Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.349528 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ppgjz" event={"ID":"c4eaf3f2-8536-46bf-8c5f-82606abec128","Type":"ContainerDied","Data":"1486a39e93d8e0297220cafe8abb99f98cb2467ab826f72f2f6792f169089f3e"} Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.349591 4690 scope.go:117] "RemoveContainer" containerID="afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2" Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.349465 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ppgjz" Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.382427 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ppgjz"] Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.394008 4690 scope.go:117] "RemoveContainer" containerID="afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2" Mar 20 17:47:40 crc kubenswrapper[4690]: E0320 17:47:40.394513 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2\": container with ID starting with afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2 not found: ID does not exist" containerID="afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2" Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.394550 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2"} err="failed to get container status \"afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2\": rpc error: code = NotFound desc = could not find container \"afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2\": container with ID starting with afca7cf50c05785fa233a36eb9a7627d0add01a18be711776213fdd9ed33b0e2 not found: ID does not exist" Mar 20 17:47:40 crc kubenswrapper[4690]: I0320 17:47:40.394954 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ppgjz"] Mar 20 17:47:41 crc kubenswrapper[4690]: I0320 17:47:41.901311 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4eaf3f2-8536-46bf-8c5f-82606abec128" path="/var/lib/kubelet/pods/c4eaf3f2-8536-46bf-8c5f-82606abec128/volumes" Mar 20 17:47:42 crc kubenswrapper[4690]: I0320 17:47:42.372345 4690 generic.go:334] "Generic (PLEG): container finished" podID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerID="d7d5b69b6b0095553f601bbd665646cebe406d148b0e16fadb9b78a4811f14fa" exitCode=0 Mar 20 17:47:42 crc kubenswrapper[4690]: I0320 17:47:42.372400 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" event={"ID":"40861b19-ba1a-4adf-8ee2-25a7c3016940","Type":"ContainerDied","Data":"d7d5b69b6b0095553f601bbd665646cebe406d148b0e16fadb9b78a4811f14fa"} Mar 20 17:47:43 crc kubenswrapper[4690]: I0320 17:47:43.388544 4690 generic.go:334] "Generic (PLEG): container finished" podID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerID="78c3e18b4f9f98941f2659a7bf16b82e2a014169aee0390b9699e183713cdddc" exitCode=0 Mar 20 17:47:43 crc kubenswrapper[4690]: I0320 17:47:43.388717 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" event={"ID":"40861b19-ba1a-4adf-8ee2-25a7c3016940","Type":"ContainerDied","Data":"78c3e18b4f9f98941f2659a7bf16b82e2a014169aee0390b9699e183713cdddc"} Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.688403 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.791635 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-util\") pod \"40861b19-ba1a-4adf-8ee2-25a7c3016940\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.791746 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-bundle\") pod \"40861b19-ba1a-4adf-8ee2-25a7c3016940\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.791778 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h99rg\" (UniqueName: \"kubernetes.io/projected/40861b19-ba1a-4adf-8ee2-25a7c3016940-kube-api-access-h99rg\") pod \"40861b19-ba1a-4adf-8ee2-25a7c3016940\" (UID: \"40861b19-ba1a-4adf-8ee2-25a7c3016940\") " Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.793458 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-bundle" (OuterVolumeSpecName: "bundle") pod "40861b19-ba1a-4adf-8ee2-25a7c3016940" (UID: "40861b19-ba1a-4adf-8ee2-25a7c3016940"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.799513 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40861b19-ba1a-4adf-8ee2-25a7c3016940-kube-api-access-h99rg" (OuterVolumeSpecName: "kube-api-access-h99rg") pod "40861b19-ba1a-4adf-8ee2-25a7c3016940" (UID: "40861b19-ba1a-4adf-8ee2-25a7c3016940"). InnerVolumeSpecName "kube-api-access-h99rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.871951 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-util" (OuterVolumeSpecName: "util") pod "40861b19-ba1a-4adf-8ee2-25a7c3016940" (UID: "40861b19-ba1a-4adf-8ee2-25a7c3016940"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.893389 4690 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.893443 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h99rg\" (UniqueName: \"kubernetes.io/projected/40861b19-ba1a-4adf-8ee2-25a7c3016940-kube-api-access-h99rg\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:44 crc kubenswrapper[4690]: I0320 17:47:44.893466 4690 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40861b19-ba1a-4adf-8ee2-25a7c3016940-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:45 crc kubenswrapper[4690]: I0320 17:47:45.417141 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" event={"ID":"40861b19-ba1a-4adf-8ee2-25a7c3016940","Type":"ContainerDied","Data":"9f961ccae126cff2ff4abed26969a25b6d3278ceadb96a3348d919d1e613e13e"} Mar 20 17:47:45 crc kubenswrapper[4690]: I0320 17:47:45.417208 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f961ccae126cff2ff4abed26969a25b6d3278ceadb96a3348d919d1e613e13e" Mar 20 17:47:45 crc kubenswrapper[4690]: I0320 17:47:45.417249 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.175756 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h"] Mar 20 17:47:53 crc kubenswrapper[4690]: E0320 17:47:53.176483 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerName="util" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.176495 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerName="util" Mar 20 17:47:53 crc kubenswrapper[4690]: E0320 17:47:53.176514 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerName="pull" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.176520 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerName="pull" Mar 20 17:47:53 crc kubenswrapper[4690]: E0320 17:47:53.176530 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4eaf3f2-8536-46bf-8c5f-82606abec128" containerName="console" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.176536 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4eaf3f2-8536-46bf-8c5f-82606abec128" containerName="console" Mar 20 17:47:53 crc kubenswrapper[4690]: E0320 17:47:53.176545 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerName="extract" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.176550 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerName="extract" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.176654 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4eaf3f2-8536-46bf-8c5f-82606abec128" containerName="console" Mar 
20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.176672 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="40861b19-ba1a-4adf-8ee2-25a7c3016940" containerName="extract" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.177032 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.182577 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.182847 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.183765 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.184133 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4gr5t" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.184369 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.189541 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h"] Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.314354 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcfa35ef-e556-4c2e-a742-84265930366f-apiservice-cert\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.314646 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp4f5\" (UniqueName: \"kubernetes.io/projected/fcfa35ef-e556-4c2e-a742-84265930366f-kube-api-access-jp4f5\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.314854 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcfa35ef-e556-4c2e-a742-84265930366f-webhook-cert\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.415743 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcfa35ef-e556-4c2e-a742-84265930366f-apiservice-cert\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.415809 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp4f5\" (UniqueName: 
\"kubernetes.io/projected/fcfa35ef-e556-4c2e-a742-84265930366f-kube-api-access-jp4f5\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.415869 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcfa35ef-e556-4c2e-a742-84265930366f-webhook-cert\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.428059 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcfa35ef-e556-4c2e-a742-84265930366f-webhook-cert\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.428075 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcfa35ef-e556-4c2e-a742-84265930366f-apiservice-cert\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.434353 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp4f5\" (UniqueName: \"kubernetes.io/projected/fcfa35ef-e556-4c2e-a742-84265930366f-kube-api-access-jp4f5\") pod \"metallb-operator-controller-manager-77445cdfc6-46r4h\" (UID: \"fcfa35ef-e556-4c2e-a742-84265930366f\") " pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.492672 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.609086 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2"] Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.610538 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.613403 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.613627 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mbmhc" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.617872 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.618505 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a5a20c4-0745-41d2-a2ff-f389423513b2-apiservice-cert\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.618571 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a5a20c4-0745-41d2-a2ff-f389423513b2-webhook-cert\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.618605 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfhw\" (UniqueName: \"kubernetes.io/projected/7a5a20c4-0745-41d2-a2ff-f389423513b2-kube-api-access-9hfhw\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.648283 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2"] Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.719275 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a5a20c4-0745-41d2-a2ff-f389423513b2-webhook-cert\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.719323 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hfhw\" (UniqueName: \"kubernetes.io/projected/7a5a20c4-0745-41d2-a2ff-f389423513b2-kube-api-access-9hfhw\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.719381 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a5a20c4-0745-41d2-a2ff-f389423513b2-apiservice-cert\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 
17:47:53.724780 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a5a20c4-0745-41d2-a2ff-f389423513b2-webhook-cert\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.725806 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a5a20c4-0745-41d2-a2ff-f389423513b2-apiservice-cert\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.765955 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfhw\" (UniqueName: \"kubernetes.io/projected/7a5a20c4-0745-41d2-a2ff-f389423513b2-kube-api-access-9hfhw\") pod \"metallb-operator-webhook-server-5b96b44647-l5zw2\" (UID: \"7a5a20c4-0745-41d2-a2ff-f389423513b2\") " pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.790165 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h"] Mar 20 17:47:53 crc kubenswrapper[4690]: I0320 17:47:53.947130 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:47:54 crc kubenswrapper[4690]: I0320 17:47:54.158977 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2"] Mar 20 17:47:54 crc kubenswrapper[4690]: I0320 17:47:54.273607 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:47:54 crc kubenswrapper[4690]: I0320 17:47:54.273672 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:47:54 crc kubenswrapper[4690]: I0320 17:47:54.467284 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" event={"ID":"fcfa35ef-e556-4c2e-a742-84265930366f","Type":"ContainerStarted","Data":"fdaf8c67c21cf878ba7b1c8215e360559d314f46f5b6f15510adc746ceb75df3"} Mar 20 17:47:54 crc kubenswrapper[4690]: I0320 17:47:54.468334 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" event={"ID":"7a5a20c4-0745-41d2-a2ff-f389423513b2","Type":"ContainerStarted","Data":"172f7081264be48cdcbbe0d4339d3a34496a947ab79592fe649b31bce2584f79"} Mar 20 17:47:57 crc kubenswrapper[4690]: I0320 17:47:57.508286 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" 
event={"ID":"fcfa35ef-e556-4c2e-a742-84265930366f","Type":"ContainerStarted","Data":"74e069d588324057f48ebcb8d7259b6cf0085a82022dd7b6f18e46bede1f953e"} Mar 20 17:47:57 crc kubenswrapper[4690]: I0320 17:47:57.508710 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:47:57 crc kubenswrapper[4690]: I0320 17:47:57.524905 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" podStartSLOduration=1.30218068 podStartE2EDuration="4.524886501s" podCreationTimestamp="2026-03-20 17:47:53 +0000 UTC" firstStartedPulling="2026-03-20 17:47:53.805129625 +0000 UTC m=+948.670955303" lastFinishedPulling="2026-03-20 17:47:57.027835446 +0000 UTC m=+951.893661124" observedRunningTime="2026-03-20 17:47:57.52345519 +0000 UTC m=+952.389280878" watchObservedRunningTime="2026-03-20 17:47:57.524886501 +0000 UTC m=+952.390712179" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.125743 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567148-8qwm8"] Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.128417 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-8qwm8" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.142462 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-8qwm8"] Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.169289 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.169560 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.169675 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.224462 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nx5p\" (UniqueName: \"kubernetes.io/projected/8bb50ad2-7215-449a-8280-a13e4e324734-kube-api-access-4nx5p\") pod \"auto-csr-approver-29567148-8qwm8\" (UID: \"8bb50ad2-7215-449a-8280-a13e4e324734\") " pod="openshift-infra/auto-csr-approver-29567148-8qwm8" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.325832 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nx5p\" (UniqueName: \"kubernetes.io/projected/8bb50ad2-7215-449a-8280-a13e4e324734-kube-api-access-4nx5p\") pod \"auto-csr-approver-29567148-8qwm8\" (UID: \"8bb50ad2-7215-449a-8280-a13e4e324734\") " pod="openshift-infra/auto-csr-approver-29567148-8qwm8" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.359516 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nx5p\" (UniqueName: \"kubernetes.io/projected/8bb50ad2-7215-449a-8280-a13e4e324734-kube-api-access-4nx5p\") pod \"auto-csr-approver-29567148-8qwm8\" (UID: \"8bb50ad2-7215-449a-8280-a13e4e324734\") " pod="openshift-infra/auto-csr-approver-29567148-8qwm8" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.490427 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-8qwm8" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.531075 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" event={"ID":"7a5a20c4-0745-41d2-a2ff-f389423513b2","Type":"ContainerStarted","Data":"d187cad54d34797697ea986aae610b17e3089f9aea1842d711574ea54f00fffa"} Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.531215 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.554364 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" podStartSLOduration=2.321570245 podStartE2EDuration="7.554340804s" podCreationTimestamp="2026-03-20 17:47:53 +0000 UTC" firstStartedPulling="2026-03-20 17:47:54.173751486 +0000 UTC m=+949.039577164" lastFinishedPulling="2026-03-20 17:47:59.406522055 +0000 UTC m=+954.272347723" observedRunningTime="2026-03-20 17:48:00.549102573 +0000 UTC m=+955.414928291" watchObservedRunningTime="2026-03-20 17:48:00.554340804 +0000 UTC m=+955.420166482" Mar 20 17:48:00 crc kubenswrapper[4690]: I0320 17:48:00.752825 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-8qwm8"] Mar 20 17:48:01 crc kubenswrapper[4690]: I0320 17:48:01.540861 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-8qwm8" event={"ID":"8bb50ad2-7215-449a-8280-a13e4e324734","Type":"ContainerStarted","Data":"ae2a776033ae6194353d5960f272535488bba861d2a5b33f92671911d4ed6434"} Mar 20 17:48:01 crc kubenswrapper[4690]: I0320 17:48:01.855700 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vpf95"] Mar 20 17:48:01 crc kubenswrapper[4690]: I0320 17:48:01.856742 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:01 crc kubenswrapper[4690]: I0320 17:48:01.867182 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpf95"] Mar 20 17:48:01 crc kubenswrapper[4690]: I0320 17:48:01.947798 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-utilities\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:01 crc kubenswrapper[4690]: I0320 17:48:01.947860 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-catalog-content\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:01 crc kubenswrapper[4690]: I0320 17:48:01.947929 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2b7m\" (UniqueName: \"kubernetes.io/projected/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-kube-api-access-b2b7m\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.049489 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2b7m\" (UniqueName: \"kubernetes.io/projected/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-kube-api-access-b2b7m\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.049567 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-utilities\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.049595 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-catalog-content\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.049972 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-utilities\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.050026 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-catalog-content\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.077550 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b2b7m\" (UniqueName: \"kubernetes.io/projected/2f23f3d2-ebe3-44b0-9872-dfb5da5932e2-kube-api-access-b2b7m\") pod \"certified-operators-vpf95\" (UID: \"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2\") " pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.185089 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.428428 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpf95"] Mar 20 17:48:02 crc kubenswrapper[4690]: W0320 17:48:02.429899 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f23f3d2_ebe3_44b0_9872_dfb5da5932e2.slice/crio-b239ed8491bdaf48400bdde3087f751cfa9bafddeedc106f1f744fb30e472193 WatchSource:0}: Error finding container b239ed8491bdaf48400bdde3087f751cfa9bafddeedc106f1f744fb30e472193: Status 404 returned error can't find the container with id b239ed8491bdaf48400bdde3087f751cfa9bafddeedc106f1f744fb30e472193 Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.547176 4690 generic.go:334] "Generic (PLEG): container finished" podID="8bb50ad2-7215-449a-8280-a13e4e324734" containerID="6403cdd07c5bf3519ce0d679d19102efd34e6c654fbee43e83f521f45730b56f" exitCode=0 Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.547263 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-8qwm8" event={"ID":"8bb50ad2-7215-449a-8280-a13e4e324734","Type":"ContainerDied","Data":"6403cdd07c5bf3519ce0d679d19102efd34e6c654fbee43e83f521f45730b56f"} Mar 20 17:48:02 crc kubenswrapper[4690]: I0320 17:48:02.549668 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpf95" event={"ID":"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2","Type":"ContainerStarted","Data":"b239ed8491bdaf48400bdde3087f751cfa9bafddeedc106f1f744fb30e472193"} Mar 20 17:48:03 crc kubenswrapper[4690]: I0320 17:48:03.558412 4690 generic.go:334] "Generic (PLEG): container finished" podID="2f23f3d2-ebe3-44b0-9872-dfb5da5932e2" containerID="75086afd8373e2e29ecae252f24e23505960dab4ba3b56035f571b5b8869caac" exitCode=0 Mar 20 17:48:03 crc kubenswrapper[4690]: I0320 17:48:03.558520 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpf95" event={"ID":"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2","Type":"ContainerDied","Data":"75086afd8373e2e29ecae252f24e23505960dab4ba3b56035f571b5b8869caac"} Mar 20 17:48:03 crc kubenswrapper[4690]: I0320 17:48:03.871759 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-8qwm8" Mar 20 17:48:03 crc kubenswrapper[4690]: I0320 17:48:03.973963 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nx5p\" (UniqueName: \"kubernetes.io/projected/8bb50ad2-7215-449a-8280-a13e4e324734-kube-api-access-4nx5p\") pod \"8bb50ad2-7215-449a-8280-a13e4e324734\" (UID: \"8bb50ad2-7215-449a-8280-a13e4e324734\") " Mar 20 17:48:03 crc kubenswrapper[4690]: I0320 17:48:03.979974 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb50ad2-7215-449a-8280-a13e4e324734-kube-api-access-4nx5p" (OuterVolumeSpecName: "kube-api-access-4nx5p") pod "8bb50ad2-7215-449a-8280-a13e4e324734" (UID: "8bb50ad2-7215-449a-8280-a13e4e324734"). InnerVolumeSpecName "kube-api-access-4nx5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:04 crc kubenswrapper[4690]: I0320 17:48:04.077613 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nx5p\" (UniqueName: \"kubernetes.io/projected/8bb50ad2-7215-449a-8280-a13e4e324734-kube-api-access-4nx5p\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:04 crc kubenswrapper[4690]: I0320 17:48:04.570421 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-8qwm8" event={"ID":"8bb50ad2-7215-449a-8280-a13e4e324734","Type":"ContainerDied","Data":"ae2a776033ae6194353d5960f272535488bba861d2a5b33f92671911d4ed6434"} Mar 20 17:48:04 crc kubenswrapper[4690]: I0320 17:48:04.570463 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae2a776033ae6194353d5960f272535488bba861d2a5b33f92671911d4ed6434" Mar 20 17:48:04 crc kubenswrapper[4690]: I0320 17:48:04.570512 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-8qwm8" Mar 20 17:48:04 crc kubenswrapper[4690]: I0320 17:48:04.998821 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-r6ngc"] Mar 20 17:48:05 crc kubenswrapper[4690]: I0320 17:48:05.003160 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-r6ngc"] Mar 20 17:48:05 crc kubenswrapper[4690]: I0320 17:48:05.894174 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75cd9cef-2bb6-4c9c-97a4-ed93def89d56" path="/var/lib/kubelet/pods/75cd9cef-2bb6-4c9c-97a4-ed93def89d56/volumes" Mar 20 17:48:07 crc kubenswrapper[4690]: I0320 17:48:07.470454 4690 scope.go:117] "RemoveContainer" containerID="09ba0fe92b758b8a8fff30349a490788fbe227c83e5ae7e7a6eeb0f893dcdaec" Mar 20 17:48:09 crc kubenswrapper[4690]: I0320 17:48:09.600356 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpf95" event={"ID":"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2","Type":"ContainerStarted","Data":"a9f3c7233d10590fe248e4c48d623bd97c96eec11a22bebf4954591e183544e4"} Mar 20 17:48:10 crc kubenswrapper[4690]: I0320 17:48:10.609706 4690 generic.go:334] "Generic (PLEG): container finished" podID="2f23f3d2-ebe3-44b0-9872-dfb5da5932e2" containerID="a9f3c7233d10590fe248e4c48d623bd97c96eec11a22bebf4954591e183544e4" exitCode=0 Mar 20 17:48:10 crc kubenswrapper[4690]: I0320 17:48:10.609807 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpf95" event={"ID":"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2","Type":"ContainerDied","Data":"a9f3c7233d10590fe248e4c48d623bd97c96eec11a22bebf4954591e183544e4"} Mar 20 17:48:11 crc kubenswrapper[4690]: I0320 17:48:11.620730 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpf95" event={"ID":"2f23f3d2-ebe3-44b0-9872-dfb5da5932e2","Type":"ContainerStarted","Data":"f504ea165b7f1666568f40b837c274968d1f9c4f793e995a857fa2da2009d009"} Mar 20 17:48:12 crc kubenswrapper[4690]: I0320 17:48:12.185446 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:12 crc kubenswrapper[4690]: I0320 17:48:12.185824 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:13 crc kubenswrapper[4690]: I0320 17:48:13.221087 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vpf95" podUID="2f23f3d2-ebe3-44b0-9872-dfb5da5932e2" containerName="registry-server" probeResult="failure" output=< Mar 20 17:48:13 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 17:48:13 crc kubenswrapper[4690]: > Mar 20 17:48:13 crc kubenswrapper[4690]: I0320 17:48:13.952458 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b96b44647-l5zw2" Mar 20 17:48:13 crc kubenswrapper[4690]: I0320 17:48:13.980425 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vpf95" podStartSLOduration=5.471240038 podStartE2EDuration="12.980408372s" podCreationTimestamp="2026-03-20 17:48:01 +0000 UTC" firstStartedPulling="2026-03-20 17:48:03.560247401 +0000 UTC m=+958.426073099" lastFinishedPulling="2026-03-20 17:48:11.069415715 +0000 UTC 
m=+965.935241433" observedRunningTime="2026-03-20 17:48:11.65340444 +0000 UTC m=+966.519230158" watchObservedRunningTime="2026-03-20 17:48:13.980408372 +0000 UTC m=+968.846234050" Mar 20 17:48:22 crc kubenswrapper[4690]: I0320 17:48:22.247096 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:22 crc kubenswrapper[4690]: I0320 17:48:22.310292 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vpf95" Mar 20 17:48:22 crc kubenswrapper[4690]: I0320 17:48:22.419691 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpf95"] Mar 20 17:48:22 crc kubenswrapper[4690]: I0320 17:48:22.497761 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wk568"] Mar 20 17:48:22 crc kubenswrapper[4690]: I0320 17:48:22.498109 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wk568" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerName="registry-server" containerID="cri-o://71d97eb223d86f53014515b0110ba5406b6331d29f8eef60028f8b7a708c39d9" gracePeriod=2 Mar 20 17:48:22 crc kubenswrapper[4690]: I0320 17:48:22.695046 4690 generic.go:334] "Generic (PLEG): container finished" podID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerID="71d97eb223d86f53014515b0110ba5406b6331d29f8eef60028f8b7a708c39d9" exitCode=0 Mar 20 17:48:22 crc kubenswrapper[4690]: I0320 17:48:22.696238 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk568" event={"ID":"09aec03c-b31e-4b02-aed6-ffce07763b4d","Type":"ContainerDied","Data":"71d97eb223d86f53014515b0110ba5406b6331d29f8eef60028f8b7a708c39d9"} Mar 20 17:48:22 crc kubenswrapper[4690]: I0320 17:48:22.923672 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.059416 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-utilities\") pod \"09aec03c-b31e-4b02-aed6-ffce07763b4d\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.059598 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4rk6\" (UniqueName: \"kubernetes.io/projected/09aec03c-b31e-4b02-aed6-ffce07763b4d-kube-api-access-v4rk6\") pod \"09aec03c-b31e-4b02-aed6-ffce07763b4d\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.059667 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-catalog-content\") pod \"09aec03c-b31e-4b02-aed6-ffce07763b4d\" (UID: \"09aec03c-b31e-4b02-aed6-ffce07763b4d\") " Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.060329 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-utilities" (OuterVolumeSpecName: "utilities") pod "09aec03c-b31e-4b02-aed6-ffce07763b4d" (UID: "09aec03c-b31e-4b02-aed6-ffce07763b4d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.066701 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09aec03c-b31e-4b02-aed6-ffce07763b4d-kube-api-access-v4rk6" (OuterVolumeSpecName: "kube-api-access-v4rk6") pod "09aec03c-b31e-4b02-aed6-ffce07763b4d" (UID: "09aec03c-b31e-4b02-aed6-ffce07763b4d"). InnerVolumeSpecName "kube-api-access-v4rk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.103188 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09aec03c-b31e-4b02-aed6-ffce07763b4d" (UID: "09aec03c-b31e-4b02-aed6-ffce07763b4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.161579 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.161614 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4rk6\" (UniqueName: \"kubernetes.io/projected/09aec03c-b31e-4b02-aed6-ffce07763b4d-kube-api-access-v4rk6\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.161626 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09aec03c-b31e-4b02-aed6-ffce07763b4d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.703795 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk568" event={"ID":"09aec03c-b31e-4b02-aed6-ffce07763b4d","Type":"ContainerDied","Data":"4a873520ac19400446f575541e893fac035686b7518241968a369f706a2b2f2c"} Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.703835 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wk568" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.703884 4690 scope.go:117] "RemoveContainer" containerID="71d97eb223d86f53014515b0110ba5406b6331d29f8eef60028f8b7a708c39d9" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.726157 4690 scope.go:117] "RemoveContainer" containerID="aa7981990c68b2ea91a311c25f4f1dbe9255c0002203813f3197221af55000cb" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.741212 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wk568"] Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.748721 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wk568"] Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.768085 4690 scope.go:117] "RemoveContainer" containerID="a9edfbb98722840af3e3fff49f0b5b309332b8d846e9e01f337019b51a70cac0" Mar 20 17:48:23 crc kubenswrapper[4690]: I0320 17:48:23.891918 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" path="/var/lib/kubelet/pods/09aec03c-b31e-4b02-aed6-ffce07763b4d/volumes" Mar 20 17:48:24 crc kubenswrapper[4690]: I0320 17:48:24.274542 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:48:24 crc kubenswrapper[4690]: I0320 17:48:24.274812 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:48:33 crc kubenswrapper[4690]: I0320 17:48:33.641118 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-77445cdfc6-46r4h" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.475782 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk"] Mar 20 17:48:34 crc kubenswrapper[4690]: E0320 17:48:34.476105 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerName="extract-utilities" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.476135 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerName="extract-utilities" Mar 20 17:48:34 crc kubenswrapper[4690]: E0320 17:48:34.476150 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerName="extract-content" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.476161 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerName="extract-content" Mar 20 17:48:34 crc kubenswrapper[4690]: E0320 17:48:34.476176 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerName="registry-server" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.476186 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerName="registry-server" Mar 20 
17:48:34 crc kubenswrapper[4690]: E0320 17:48:34.476203 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb50ad2-7215-449a-8280-a13e4e324734" containerName="oc" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.476213 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb50ad2-7215-449a-8280-a13e4e324734" containerName="oc" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.476425 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="09aec03c-b31e-4b02-aed6-ffce07763b4d" containerName="registry-server" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.476457 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb50ad2-7215-449a-8280-a13e4e324734" containerName="oc" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.476975 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.481625 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.481635 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ts2g9" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.491072 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9cwmb"] Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.496733 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.498822 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.498955 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.501208 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk"] Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.517799 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-fn9hk\" (UID: \"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.517889 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl5dj\" (UniqueName: \"kubernetes.io/projected/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-kube-api-access-hl5dj\") pod \"frr-k8s-webhook-server-bcc4b6f68-fn9hk\" (UID: \"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.551951 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cfggn"] Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.552787 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.554733 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.555037 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-gfzgk" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.555216 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.557444 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.564429 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-jbfw7"] Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.565301 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.574476 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.584664 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-jbfw7"] Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.618868 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw4tr\" (UniqueName: \"kubernetes.io/projected/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-kube-api-access-sw4tr\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.618923 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-fn9hk\" (UID: \"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.618945 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-metrics\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.618968 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-conf\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.618987 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-sockets\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619006 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/2edd85fb-0387-4738-ba35-2b326b635a1b-metrics-certs\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619024 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6r5\" (UniqueName: \"kubernetes.io/projected/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-kube-api-access-st6r5\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619044 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-metrics-certs\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619082 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl5dj\" (UniqueName: \"kubernetes.io/projected/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-kube-api-access-hl5dj\") pod \"frr-k8s-webhook-server-bcc4b6f68-fn9hk\" (UID: \"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619103 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-cert\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619120 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4vhx\" (UniqueName: \"kubernetes.io/projected/2edd85fb-0387-4738-ba35-2b326b635a1b-kube-api-access-v4vhx\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619144 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-reloader\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619159 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-metallb-excludel2\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619175 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619193 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-metrics-certs\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.619205 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-startup\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: E0320 17:48:34.619334 4690 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 17:48:34 crc kubenswrapper[4690]: E0320 17:48:34.619373 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-cert podName:60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0 nodeName:}" failed. No retries permitted until 2026-03-20 17:48:35.119358256 +0000 UTC m=+989.985183934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-cert") pod "frr-k8s-webhook-server-bcc4b6f68-fn9hk" (UID: "60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0") : secret "frr-k8s-webhook-server-cert" not found Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.637126 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl5dj\" (UniqueName: \"kubernetes.io/projected/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-kube-api-access-hl5dj\") pod \"frr-k8s-webhook-server-bcc4b6f68-fn9hk\" (UID: \"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.720790 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-cert\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721086 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4vhx\" (UniqueName: \"kubernetes.io/projected/2edd85fb-0387-4738-ba35-2b326b635a1b-kube-api-access-v4vhx\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721115 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-reloader\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721134 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-metallb-excludel2\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721150 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist\") pod \"speaker-cfggn\" (UID: 
\"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721170 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-metrics-certs\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721189 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-startup\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721226 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw4tr\" (UniqueName: \"kubernetes.io/projected/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-kube-api-access-sw4tr\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721302 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-metrics\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: E0320 17:48:34.721320 4690 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 17:48:34 crc kubenswrapper[4690]: E0320 17:48:34.721383 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist podName:9e0ecbbf-1e0c-408a-b58c-07cd90497c39 nodeName:}" failed. No retries permitted until 2026-03-20 17:48:35.22136452 +0000 UTC m=+990.087190198 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist") pod "speaker-cfggn" (UID: "9e0ecbbf-1e0c-408a-b58c-07cd90497c39") : secret "metallb-memberlist" not found Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721329 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-conf\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721753 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-sockets\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721813 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2edd85fb-0387-4738-ba35-2b326b635a1b-metrics-certs\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721852 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-reloader\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.721994 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st6r5\" (UniqueName: \"kubernetes.io/projected/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-kube-api-access-st6r5\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.722035 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-metrics-certs\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.722508 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-metallb-excludel2\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.722663 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-metrics\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.722729 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-sockets\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.722911 4690 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-conf\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.723856 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2edd85fb-0387-4738-ba35-2b326b635a1b-frr-startup\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.724318 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-metrics-certs\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.724728 4690 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.727280 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-metrics-certs\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.729904 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2edd85fb-0387-4738-ba35-2b326b635a1b-metrics-certs\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.733404 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-cert\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.738483 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6r5\" (UniqueName: \"kubernetes.io/projected/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-kube-api-access-st6r5\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.740431 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4vhx\" (UniqueName: \"kubernetes.io/projected/2edd85fb-0387-4738-ba35-2b326b635a1b-kube-api-access-v4vhx\") pod \"frr-k8s-9cwmb\" (UID: \"2edd85fb-0387-4738-ba35-2b326b635a1b\") " pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.741180 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw4tr\" (UniqueName: \"kubernetes.io/projected/fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e-kube-api-access-sw4tr\") pod \"controller-7bb4cc7c98-jbfw7\" (UID: \"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e\") " pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.824664 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:34 crc kubenswrapper[4690]: I0320 17:48:34.877242 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.098602 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-jbfw7"] Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.128381 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-fn9hk\" (UID: \"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.138372 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-fn9hk\" (UID: \"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.229722 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:35 crc kubenswrapper[4690]: E0320 17:48:35.229919 4690 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 17:48:35 crc kubenswrapper[4690]: E0320 17:48:35.229989 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist podName:9e0ecbbf-1e0c-408a-b58c-07cd90497c39 nodeName:}" failed. No retries permitted until 2026-03-20 17:48:36.229969757 +0000 UTC m=+991.095795455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist") pod "speaker-cfggn" (UID: "9e0ecbbf-1e0c-408a-b58c-07cd90497c39") : secret "metallb-memberlist" not found Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.398315 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.633481 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk"] Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.673490 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" event={"ID":"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0","Type":"ContainerStarted","Data":"021f72e400f7d4676ff531dd3a69f0129b88449fde8f98968117ca2fdd9728cf"} Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.674638 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerStarted","Data":"6fd8534e1601f821c83254cdbe5dc64f84e4ac69bd51898055db202e321637b4"} Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.676987 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jbfw7" event={"ID":"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e","Type":"ContainerStarted","Data":"16ecaa3766640536300734e74b24561cd4565e6b44aca81b10c497846b575f01"} Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.677030 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jbfw7" event={"ID":"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e","Type":"ContainerStarted","Data":"baf6b4bb8193914652f1306e53f1871a950dc7529ce9d1ec64a55b57bb895a9b"} Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.677039 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jbfw7" event={"ID":"fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e","Type":"ContainerStarted","Data":"ca0f2730d38c6d816110a65237ca73f8f198f6bfe8c340219a5fe296cb74d796"} Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.677157 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:35 crc kubenswrapper[4690]: I0320 17:48:35.697853 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-jbfw7" podStartSLOduration=1.697836052 podStartE2EDuration="1.697836052s" podCreationTimestamp="2026-03-20 17:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:48:35.690660996 +0000 UTC m=+990.556486674" watchObservedRunningTime="2026-03-20 17:48:35.697836052 +0000 UTC m=+990.563661730" Mar 20 17:48:36 crc kubenswrapper[4690]: I0320 17:48:36.245500 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:36 crc kubenswrapper[4690]: I0320 17:48:36.252152 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9e0ecbbf-1e0c-408a-b58c-07cd90497c39-memberlist\") pod \"speaker-cfggn\" (UID: \"9e0ecbbf-1e0c-408a-b58c-07cd90497c39\") " pod="metallb-system/speaker-cfggn" Mar 20 17:48:36 crc kubenswrapper[4690]: I0320 17:48:36.366553 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-cfggn" Mar 20 17:48:36 crc kubenswrapper[4690]: I0320 17:48:36.686358 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cfggn" event={"ID":"9e0ecbbf-1e0c-408a-b58c-07cd90497c39","Type":"ContainerStarted","Data":"a07c2baf87dcfe09661d1938a4e2f435a300c533e7d2fd8c2cd8f4f17d98300a"} Mar 20 17:48:36 crc kubenswrapper[4690]: I0320 17:48:36.686407 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cfggn" event={"ID":"9e0ecbbf-1e0c-408a-b58c-07cd90497c39","Type":"ContainerStarted","Data":"80c13021af2c33bec2b3d06ab29cd8b03cacdc40de22a85c48e49443ae194af4"} Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.581761 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4pvcd"] Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.583268 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.593204 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pvcd"] Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.667247 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-catalog-content\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.667380 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfc25\" (UniqueName: \"kubernetes.io/projected/6d5cf370-fbfa-4dba-bced-93152a414c47-kube-api-access-wfc25\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.667402 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-utilities\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.699405 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cfggn" event={"ID":"9e0ecbbf-1e0c-408a-b58c-07cd90497c39","Type":"ContainerStarted","Data":"784e35dea66d38ed26a4860144a083a350993f87f257ef722064261cebf3e9eb"} Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.699629 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cfggn" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.725027 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cfggn" podStartSLOduration=3.725012931 podStartE2EDuration="3.725012931s" podCreationTimestamp="2026-03-20 17:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:48:37.722115548 +0000 UTC m=+992.587941226" watchObservedRunningTime="2026-03-20 17:48:37.725012931 +0000 UTC m=+992.590838609" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.769118 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfc25\" (UniqueName: \"kubernetes.io/projected/6d5cf370-fbfa-4dba-bced-93152a414c47-kube-api-access-wfc25\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.769163 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-utilities\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.769214 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-catalog-content\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.769726 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-catalog-content\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.769835 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-utilities\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.809087 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfc25\" (UniqueName: \"kubernetes.io/projected/6d5cf370-fbfa-4dba-bced-93152a414c47-kube-api-access-wfc25\") pod \"community-operators-4pvcd\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:37 crc kubenswrapper[4690]: I0320 17:48:37.897143 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:38 crc kubenswrapper[4690]: I0320 17:48:38.382794 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pvcd"] Mar 20 17:48:38 crc kubenswrapper[4690]: I0320 17:48:38.717081 4690 generic.go:334] "Generic (PLEG): container finished" podID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerID="e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51" exitCode=0 Mar 20 17:48:38 crc kubenswrapper[4690]: I0320 17:48:38.717266 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvcd" event={"ID":"6d5cf370-fbfa-4dba-bced-93152a414c47","Type":"ContainerDied","Data":"e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51"} Mar 20 17:48:38 crc kubenswrapper[4690]: I0320 17:48:38.718210 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvcd" event={"ID":"6d5cf370-fbfa-4dba-bced-93152a414c47","Type":"ContainerStarted","Data":"aeec5115a8e8ae6be7578a79fbb83fc5a661b647424503fad144302043174859"} Mar 20 17:48:39 crc kubenswrapper[4690]: I0320 17:48:39.727843 4690 generic.go:334] "Generic (PLEG): container finished" podID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerID="7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222" exitCode=0 Mar 20 17:48:39 crc kubenswrapper[4690]: I0320 17:48:39.728086 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvcd" event={"ID":"6d5cf370-fbfa-4dba-bced-93152a414c47","Type":"ContainerDied","Data":"7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222"} Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.572907 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pvd9b"] Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.575175 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.584323 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvd9b"] Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.617074 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-catalog-content\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.617134 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-utilities\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.617197 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfwpt\" (UniqueName: \"kubernetes.io/projected/4331cc3d-7905-49e0-b80a-f6b178d411ae-kube-api-access-wfwpt\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.718286 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-utilities\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.718383 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfwpt\" (UniqueName: \"kubernetes.io/projected/4331cc3d-7905-49e0-b80a-f6b178d411ae-kube-api-access-wfwpt\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.718428 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-catalog-content\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.718853 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-utilities\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.718930 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-catalog-content\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.744444 4690 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wfwpt\" (UniqueName: \"kubernetes.io/projected/4331cc3d-7905-49e0-b80a-f6b178d411ae-kube-api-access-wfwpt\") pod \"redhat-marketplace-pvd9b\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:40 crc kubenswrapper[4690]: I0320 17:48:40.934857 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:42 crc kubenswrapper[4690]: I0320 17:48:42.947816 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvd9b"] Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.759546 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvcd" event={"ID":"6d5cf370-fbfa-4dba-bced-93152a414c47","Type":"ContainerStarted","Data":"18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45"} Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.764558 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" event={"ID":"60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0","Type":"ContainerStarted","Data":"1e32b42411227f1274120d6a4611632414a8749b29dc80b118d6524ee5f18faa"} Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.764814 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.771059 4690 generic.go:334] "Generic (PLEG): container finished" podID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerID="1965a485a3e86fd29b9983142bd03d03793ebad7052e1db882233deb479e4f49" exitCode=0 Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.771149 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvd9b" event={"ID":"4331cc3d-7905-49e0-b80a-f6b178d411ae","Type":"ContainerDied","Data":"1965a485a3e86fd29b9983142bd03d03793ebad7052e1db882233deb479e4f49"} Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.771183 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvd9b" event={"ID":"4331cc3d-7905-49e0-b80a-f6b178d411ae","Type":"ContainerStarted","Data":"a05bbd80e6dcd92210e6ce5562ea88b613bf238548aacc22b6c4fe6074af17a8"} Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.774831 4690 generic.go:334] "Generic (PLEG): container finished" podID="2edd85fb-0387-4738-ba35-2b326b635a1b" containerID="e42a8f5cb54bdfbf8ed835850f97c904610f3f9702840433e5ede4c718ba16d2" exitCode=0 Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.774884 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerDied","Data":"e42a8f5cb54bdfbf8ed835850f97c904610f3f9702840433e5ede4c718ba16d2"} Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.790163 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4pvcd" podStartSLOduration=2.696306952 podStartE2EDuration="6.790141466s" podCreationTimestamp="2026-03-20 17:48:37 +0000 UTC" firstStartedPulling="2026-03-20 17:48:38.719484281 +0000 UTC m=+993.585309959" lastFinishedPulling="2026-03-20 17:48:42.813318805 +0000 UTC m=+997.679144473" observedRunningTime="2026-03-20 17:48:43.786270305 +0000 UTC m=+998.652096123" watchObservedRunningTime="2026-03-20 17:48:43.790141466 
+0000 UTC m=+998.655967144" Mar 20 17:48:43 crc kubenswrapper[4690]: I0320 17:48:43.857212 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" podStartSLOduration=2.645458134 podStartE2EDuration="9.857194225s" podCreationTimestamp="2026-03-20 17:48:34 +0000 UTC" firstStartedPulling="2026-03-20 17:48:35.649024538 +0000 UTC m=+990.514850226" lastFinishedPulling="2026-03-20 17:48:42.860760639 +0000 UTC m=+997.726586317" observedRunningTime="2026-03-20 17:48:43.855395143 +0000 UTC m=+998.721220831" watchObservedRunningTime="2026-03-20 17:48:43.857194225 +0000 UTC m=+998.723019903" Mar 20 17:48:44 crc kubenswrapper[4690]: I0320 17:48:44.782147 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvd9b" event={"ID":"4331cc3d-7905-49e0-b80a-f6b178d411ae","Type":"ContainerStarted","Data":"9ad9115247aafa1301ac3af9dec61ff4aef0639cbad4072484683f177f717f15"} Mar 20 17:48:44 crc kubenswrapper[4690]: I0320 17:48:44.784125 4690 generic.go:334] "Generic (PLEG): container finished" podID="2edd85fb-0387-4738-ba35-2b326b635a1b" containerID="7e267ee8c70535907df0cc14140dea17d43dde78b5b45e20a3f294b4a0a875da" exitCode=0 Mar 20 17:48:44 crc kubenswrapper[4690]: I0320 17:48:44.784204 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerDied","Data":"7e267ee8c70535907df0cc14140dea17d43dde78b5b45e20a3f294b4a0a875da"} Mar 20 17:48:45 crc kubenswrapper[4690]: I0320 17:48:45.794764 4690 generic.go:334] "Generic (PLEG): container finished" podID="2edd85fb-0387-4738-ba35-2b326b635a1b" containerID="c4b6d4ef9141dc6e58e6336ef6600e43651ff1c390781f78a8fa33d53e3264c0" exitCode=0 Mar 20 17:48:45 crc kubenswrapper[4690]: I0320 17:48:45.795147 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerDied","Data":"c4b6d4ef9141dc6e58e6336ef6600e43651ff1c390781f78a8fa33d53e3264c0"} Mar 20 17:48:45 crc kubenswrapper[4690]: I0320 17:48:45.799627 4690 generic.go:334] "Generic (PLEG): container finished" podID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerID="9ad9115247aafa1301ac3af9dec61ff4aef0639cbad4072484683f177f717f15" exitCode=0 Mar 20 17:48:45 crc kubenswrapper[4690]: I0320 17:48:45.799669 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvd9b" event={"ID":"4331cc3d-7905-49e0-b80a-f6b178d411ae","Type":"ContainerDied","Data":"9ad9115247aafa1301ac3af9dec61ff4aef0639cbad4072484683f177f717f15"} Mar 20 17:48:46 crc kubenswrapper[4690]: I0320 17:48:46.370359 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cfggn" Mar 20 17:48:46 crc kubenswrapper[4690]: I0320 17:48:46.809708 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvd9b" event={"ID":"4331cc3d-7905-49e0-b80a-f6b178d411ae","Type":"ContainerStarted","Data":"0c296493f2d26f37a45c3fc71b5bad3937b0d9c8f5e3e133beffb7249e0856a3"} Mar 20 17:48:46 crc kubenswrapper[4690]: I0320 17:48:46.816986 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerStarted","Data":"4a3c16999f9d78973c279bc63769d9497a6880a1fd01caa4083d25d903f326ec"} Mar 20 17:48:46 crc kubenswrapper[4690]: I0320 17:48:46.817037 4690 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerStarted","Data":"e5fb21486d756dd6d9ca9b40dd8a8ec6a6dd401e8c6b02c3b571404f5969ed1a"} Mar 20 17:48:46 crc kubenswrapper[4690]: I0320 17:48:46.817050 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerStarted","Data":"a57831146ec69d40127b7ebcf78bcb8e3ff135726fcad07381f3432a7a6cef5e"} Mar 20 17:48:46 crc kubenswrapper[4690]: I0320 17:48:46.817062 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerStarted","Data":"df72040de117d117b252d9d884193e5c20c15200ccb7e901d1ca4e761cd35401"} Mar 20 17:48:46 crc kubenswrapper[4690]: I0320 17:48:46.817074 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerStarted","Data":"b0f339744d351db5031c0a8facb06953d4aa806f91528335cdca15c5174cce7f"} Mar 20 17:48:46 crc kubenswrapper[4690]: I0320 17:48:46.845518 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pvd9b" podStartSLOduration=4.445600705 podStartE2EDuration="6.845491924s" podCreationTimestamp="2026-03-20 17:48:40 +0000 UTC" firstStartedPulling="2026-03-20 17:48:43.772234421 +0000 UTC m=+998.638060099" lastFinishedPulling="2026-03-20 17:48:46.17212564 +0000 UTC m=+1001.037951318" observedRunningTime="2026-03-20 17:48:46.838897635 +0000 UTC m=+1001.704723343" watchObservedRunningTime="2026-03-20 17:48:46.845491924 +0000 UTC m=+1001.711317632" Mar 20 17:48:47 crc kubenswrapper[4690]: I0320 17:48:47.831773 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cwmb" event={"ID":"2edd85fb-0387-4738-ba35-2b326b635a1b","Type":"ContainerStarted","Data":"3b73e4ed30948816b7e0dda5dbfa3009b893ef0aaef8be448d38d811997313b2"} Mar 20 17:48:47 crc kubenswrapper[4690]: I0320 17:48:47.860802 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9cwmb" podStartSLOduration=5.956698171 podStartE2EDuration="13.860783203s" podCreationTimestamp="2026-03-20 17:48:34 +0000 UTC" firstStartedPulling="2026-03-20 17:48:34.956595055 +0000 UTC m=+989.822420733" lastFinishedPulling="2026-03-20 17:48:42.860680077 +0000 UTC m=+997.726505765" observedRunningTime="2026-03-20 17:48:47.858447766 +0000 UTC m=+1002.724273454" watchObservedRunningTime="2026-03-20 17:48:47.860783203 +0000 UTC m=+1002.726608901" Mar 20 17:48:47 crc kubenswrapper[4690]: I0320 17:48:47.897921 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:47 crc kubenswrapper[4690]: I0320 17:48:47.897969 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:47 crc kubenswrapper[4690]: I0320 17:48:47.966664 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:48 crc kubenswrapper[4690]: I0320 17:48:48.841291 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:48 crc kubenswrapper[4690]: I0320 17:48:48.895804 4690 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:49 crc kubenswrapper[4690]: I0320 17:48:49.159238 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pvcd"] Mar 20 17:48:49 crc kubenswrapper[4690]: I0320 17:48:49.825309 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:49 crc kubenswrapper[4690]: I0320 17:48:49.896108 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:48:50 crc kubenswrapper[4690]: I0320 17:48:50.850746 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4pvcd" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerName="registry-server" containerID="cri-o://18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45" gracePeriod=2 Mar 20 17:48:50 crc kubenswrapper[4690]: I0320 17:48:50.935598 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:50 crc kubenswrapper[4690]: I0320 17:48:50.935885 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:50 crc kubenswrapper[4690]: I0320 17:48:50.981942 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.207205 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.274455 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfc25\" (UniqueName: \"kubernetes.io/projected/6d5cf370-fbfa-4dba-bced-93152a414c47-kube-api-access-wfc25\") pod \"6d5cf370-fbfa-4dba-bced-93152a414c47\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.275073 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-utilities\") pod \"6d5cf370-fbfa-4dba-bced-93152a414c47\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.275229 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-catalog-content\") pod \"6d5cf370-fbfa-4dba-bced-93152a414c47\" (UID: \"6d5cf370-fbfa-4dba-bced-93152a414c47\") " Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.276153 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-utilities" (OuterVolumeSpecName: "utilities") pod "6d5cf370-fbfa-4dba-bced-93152a414c47" (UID: "6d5cf370-fbfa-4dba-bced-93152a414c47"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.283818 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5cf370-fbfa-4dba-bced-93152a414c47-kube-api-access-wfc25" (OuterVolumeSpecName: "kube-api-access-wfc25") pod "6d5cf370-fbfa-4dba-bced-93152a414c47" (UID: "6d5cf370-fbfa-4dba-bced-93152a414c47"). InnerVolumeSpecName "kube-api-access-wfc25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.288014 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.288058 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfc25\" (UniqueName: \"kubernetes.io/projected/6d5cf370-fbfa-4dba-bced-93152a414c47-kube-api-access-wfc25\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.350121 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d5cf370-fbfa-4dba-bced-93152a414c47" (UID: "6d5cf370-fbfa-4dba-bced-93152a414c47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.388914 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cf370-fbfa-4dba-bced-93152a414c47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.862914 4690 generic.go:334] "Generic (PLEG): container finished" podID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerID="18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45" exitCode=0 Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.863024 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pvcd" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.863048 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvcd" event={"ID":"6d5cf370-fbfa-4dba-bced-93152a414c47","Type":"ContainerDied","Data":"18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45"} Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.863137 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvcd" event={"ID":"6d5cf370-fbfa-4dba-bced-93152a414c47","Type":"ContainerDied","Data":"aeec5115a8e8ae6be7578a79fbb83fc5a661b647424503fad144302043174859"} Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.863177 4690 scope.go:117] "RemoveContainer" containerID="18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.921592 4690 scope.go:117] "RemoveContainer" containerID="7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.922582 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pvcd"] Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.937153 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4pvcd"] Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.942887 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.946640 4690 scope.go:117] "RemoveContainer" containerID="e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.971423 4690 scope.go:117] "RemoveContainer" containerID="18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45" Mar 20 17:48:51 crc kubenswrapper[4690]: E0320 17:48:51.971858 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45\": container with ID starting with 18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45 not found: ID does not exist" containerID="18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.971896 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45"} err="failed to get container status \"18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45\": rpc error: code = NotFound desc = could not find container \"18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45\": container with ID starting with 18abf448cda002c8eae623f06974c8fa35fe43165fef3140f2c7fd26e024db45 not found: ID does not exist" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.971923 4690 scope.go:117] "RemoveContainer" containerID="7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222" Mar 20 17:48:51 crc kubenswrapper[4690]: E0320 17:48:51.972436 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222\": container with ID starting with 
7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222 not found: ID does not exist" containerID="7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.972472 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222"} err="failed to get container status \"7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222\": rpc error: code = NotFound desc = could not find container \"7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222\": container with ID starting with 7bd21c65c37281b815be53e217d1087210f6e035017127586d42bcf208989222 not found: ID does not exist" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.972498 4690 scope.go:117] "RemoveContainer" containerID="e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51" Mar 20 17:48:51 crc kubenswrapper[4690]: E0320 17:48:51.972738 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51\": container with ID starting with e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51 not found: ID does not exist" containerID="e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51" Mar 20 17:48:51 crc kubenswrapper[4690]: I0320 17:48:51.972764 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51"} err="failed to get container status \"e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51\": rpc error: code = NotFound desc = could not find container \"e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51\": container with ID starting with e6ac4db99cea2bdc8a1003a621c48f9d4b9599d1004b267324e453173742eb51 not found: ID does not exist" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.379365 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xcb58"] Mar 20 17:48:52 crc kubenswrapper[4690]: E0320 17:48:52.382209 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerName="extract-utilities" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.382308 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerName="extract-utilities" Mar 20 17:48:52 crc kubenswrapper[4690]: E0320 17:48:52.382381 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerName="extract-content" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.382401 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerName="extract-content" Mar 20 17:48:52 crc kubenswrapper[4690]: E0320 17:48:52.382434 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerName="registry-server" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.382451 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" containerName="registry-server" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.382750 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" 
containerName="registry-server" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.384302 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.387573 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.388019 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.388298 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2zmpq" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.391082 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xcb58"] Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.506476 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptmk\" (UniqueName: \"kubernetes.io/projected/595a25b2-d477-4ec7-b9ad-8eb670c2ea3f-kube-api-access-8ptmk\") pod \"openstack-operator-index-xcb58\" (UID: \"595a25b2-d477-4ec7-b9ad-8eb670c2ea3f\") " pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.608438 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptmk\" (UniqueName: \"kubernetes.io/projected/595a25b2-d477-4ec7-b9ad-8eb670c2ea3f-kube-api-access-8ptmk\") pod \"openstack-operator-index-xcb58\" (UID: \"595a25b2-d477-4ec7-b9ad-8eb670c2ea3f\") " pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.641957 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptmk\" (UniqueName: \"kubernetes.io/projected/595a25b2-d477-4ec7-b9ad-8eb670c2ea3f-kube-api-access-8ptmk\") pod \"openstack-operator-index-xcb58\" (UID: \"595a25b2-d477-4ec7-b9ad-8eb670c2ea3f\") " pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:48:52 crc kubenswrapper[4690]: I0320 17:48:52.712036 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:48:53 crc kubenswrapper[4690]: I0320 17:48:53.256806 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xcb58"] Mar 20 17:48:53 crc kubenswrapper[4690]: W0320 17:48:53.262585 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod595a25b2_d477_4ec7_b9ad_8eb670c2ea3f.slice/crio-0a4faa6b5095fcadc6a1ab105ac8fa77f9351f73fbdb8d81f244f75573ee904c WatchSource:0}: Error finding container 0a4faa6b5095fcadc6a1ab105ac8fa77f9351f73fbdb8d81f244f75573ee904c: Status 404 returned error can't find the container with id 0a4faa6b5095fcadc6a1ab105ac8fa77f9351f73fbdb8d81f244f75573ee904c Mar 20 17:48:53 crc kubenswrapper[4690]: I0320 17:48:53.911969 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5cf370-fbfa-4dba-bced-93152a414c47" path="/var/lib/kubelet/pods/6d5cf370-fbfa-4dba-bced-93152a414c47/volumes" Mar 20 17:48:53 crc kubenswrapper[4690]: I0320 17:48:53.912656 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xcb58" event={"ID":"595a25b2-d477-4ec7-b9ad-8eb670c2ea3f","Type":"ContainerStarted","Data":"0a4faa6b5095fcadc6a1ab105ac8fa77f9351f73fbdb8d81f244f75573ee904c"} Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.274334 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.274390 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.274437 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.275015 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3597106c9e9367c28d243129fc42edbd4550b54914b1aeed86c0200ac6936ead"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.275969 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://3597106c9e9367c28d243129fc42edbd4550b54914b1aeed86c0200ac6936ead" gracePeriod=600 Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.886078 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-jbfw7" Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.917521 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" 
containerID="3597106c9e9367c28d243129fc42edbd4550b54914b1aeed86c0200ac6936ead" exitCode=0 Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.917574 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"3597106c9e9367c28d243129fc42edbd4550b54914b1aeed86c0200ac6936ead"} Mar 20 17:48:54 crc kubenswrapper[4690]: I0320 17:48:54.917607 4690 scope.go:117] "RemoveContainer" containerID="cf3fdd9123c95cd6ed2bd1666f574e1450c7c3856ffdba8c0585b34757d0cf92" Mar 20 17:48:55 crc kubenswrapper[4690]: I0320 17:48:55.417674 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-fn9hk" Mar 20 17:48:56 crc kubenswrapper[4690]: I0320 17:48:56.947359 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"ab2561b6600e9d6bebb46c2c746c35623906cf56d05e6dcd356c447e3e87dfa1"} Mar 20 17:48:56 crc kubenswrapper[4690]: I0320 17:48:56.950376 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xcb58" event={"ID":"595a25b2-d477-4ec7-b9ad-8eb670c2ea3f","Type":"ContainerStarted","Data":"a5895acacaa6ec144613d8e7781289331c7c08be1b3199a556e87c768a0db2e1"} Mar 20 17:48:56 crc kubenswrapper[4690]: I0320 17:48:56.985844 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xcb58" podStartSLOduration=1.9227438289999998 podStartE2EDuration="4.985820189s" podCreationTimestamp="2026-03-20 17:48:52 +0000 UTC" firstStartedPulling="2026-03-20 17:48:53.267366451 +0000 UTC m=+1008.133192149" lastFinishedPulling="2026-03-20 17:48:56.330442831 +0000 UTC m=+1011.196268509" observedRunningTime="2026-03-20 17:48:56.98376163 +0000 UTC m=+1011.849587358" watchObservedRunningTime="2026-03-20 17:48:56.985820189 +0000 UTC m=+1011.851645907" Mar 20 17:48:57 crc kubenswrapper[4690]: I0320 17:48:57.559634 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvd9b"] Mar 20 17:48:57 crc kubenswrapper[4690]: I0320 17:48:57.560890 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pvd9b" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerName="registry-server" containerID="cri-o://0c296493f2d26f37a45c3fc71b5bad3937b0d9c8f5e3e133beffb7249e0856a3" gracePeriod=2 Mar 20 17:48:57 crc kubenswrapper[4690]: I0320 17:48:57.962300 4690 generic.go:334] "Generic (PLEG): container finished" podID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerID="0c296493f2d26f37a45c3fc71b5bad3937b0d9c8f5e3e133beffb7249e0856a3" exitCode=0 Mar 20 17:48:57 crc kubenswrapper[4690]: I0320 17:48:57.962405 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvd9b" event={"ID":"4331cc3d-7905-49e0-b80a-f6b178d411ae","Type":"ContainerDied","Data":"0c296493f2d26f37a45c3fc71b5bad3937b0d9c8f5e3e133beffb7249e0856a3"} Mar 20 17:48:57 crc kubenswrapper[4690]: I0320 17:48:57.962676 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvd9b" event={"ID":"4331cc3d-7905-49e0-b80a-f6b178d411ae","Type":"ContainerDied","Data":"a05bbd80e6dcd92210e6ce5562ea88b613bf238548aacc22b6c4fe6074af17a8"} Mar 20 
17:48:57 crc kubenswrapper[4690]: I0320 17:48:57.962728 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a05bbd80e6dcd92210e6ce5562ea88b613bf238548aacc22b6c4fe6074af17a8" Mar 20 17:48:57 crc kubenswrapper[4690]: I0320 17:48:57.973742 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.145706 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfwpt\" (UniqueName: \"kubernetes.io/projected/4331cc3d-7905-49e0-b80a-f6b178d411ae-kube-api-access-wfwpt\") pod \"4331cc3d-7905-49e0-b80a-f6b178d411ae\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.145807 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-utilities\") pod \"4331cc3d-7905-49e0-b80a-f6b178d411ae\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.145845 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-catalog-content\") pod \"4331cc3d-7905-49e0-b80a-f6b178d411ae\" (UID: \"4331cc3d-7905-49e0-b80a-f6b178d411ae\") " Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.147948 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-utilities" (OuterVolumeSpecName: "utilities") pod "4331cc3d-7905-49e0-b80a-f6b178d411ae" (UID: "4331cc3d-7905-49e0-b80a-f6b178d411ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.152822 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4331cc3d-7905-49e0-b80a-f6b178d411ae-kube-api-access-wfwpt" (OuterVolumeSpecName: "kube-api-access-wfwpt") pod "4331cc3d-7905-49e0-b80a-f6b178d411ae" (UID: "4331cc3d-7905-49e0-b80a-f6b178d411ae"). InnerVolumeSpecName "kube-api-access-wfwpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.190840 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4331cc3d-7905-49e0-b80a-f6b178d411ae" (UID: "4331cc3d-7905-49e0-b80a-f6b178d411ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.248314 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfwpt\" (UniqueName: \"kubernetes.io/projected/4331cc3d-7905-49e0-b80a-f6b178d411ae-kube-api-access-wfwpt\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.248352 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.248366 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4331cc3d-7905-49e0-b80a-f6b178d411ae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:58 crc kubenswrapper[4690]: I0320 17:48:58.969701 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvd9b" Mar 20 17:48:59 crc kubenswrapper[4690]: I0320 17:48:59.027066 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvd9b"] Mar 20 17:48:59 crc kubenswrapper[4690]: I0320 17:48:59.035070 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvd9b"] Mar 20 17:48:59 crc kubenswrapper[4690]: I0320 17:48:59.896909 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" path="/var/lib/kubelet/pods/4331cc3d-7905-49e0-b80a-f6b178d411ae/volumes" Mar 20 17:49:02 crc kubenswrapper[4690]: I0320 17:49:02.712903 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:49:02 crc kubenswrapper[4690]: I0320 17:49:02.713862 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:49:02 crc kubenswrapper[4690]: I0320 17:49:02.761024 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:49:03 crc kubenswrapper[4690]: I0320 17:49:03.042961 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xcb58" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.419334 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl"] Mar 20 17:49:04 crc kubenswrapper[4690]: E0320 17:49:04.419623 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerName="extract-utilities" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.419638 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerName="extract-utilities" Mar 20 17:49:04 crc kubenswrapper[4690]: E0320 17:49:04.419655 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerName="extract-content" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.419664 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerName="extract-content" Mar 20 17:49:04 crc kubenswrapper[4690]: E0320 17:49:04.419677 4690 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerName="registry-server" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.419686 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerName="registry-server" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.419850 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="4331cc3d-7905-49e0-b80a-f6b178d411ae" containerName="registry-server" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.421016 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.424528 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6fwxb" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.429464 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl"] Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.440437 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvxwk\" (UniqueName: \"kubernetes.io/projected/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-kube-api-access-hvxwk\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.440783 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-util\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.440850 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-bundle\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.542047 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-bundle\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.542128 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvxwk\" (UniqueName: \"kubernetes.io/projected/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-kube-api-access-hvxwk\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.542166 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-util\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.542605 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-util\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.542755 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-bundle\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.565709 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvxwk\" (UniqueName: \"kubernetes.io/projected/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-kube-api-access-hvxwk\") pod \"d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.738205 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:04 crc kubenswrapper[4690]: I0320 17:49:04.828539 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9cwmb" Mar 20 17:49:05 crc kubenswrapper[4690]: I0320 17:49:05.085390 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl"] Mar 20 17:49:05 crc kubenswrapper[4690]: W0320 17:49:05.096209 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b6dbe3_2fff_4985_91e0_270e2d42fcbc.slice/crio-94b9f9c945882f961d37e5ed1f5b777feb33189c0d7a0abd15d3d83bc39c8978 WatchSource:0}: Error finding container 94b9f9c945882f961d37e5ed1f5b777feb33189c0d7a0abd15d3d83bc39c8978: Status 404 returned error can't find the container with id 94b9f9c945882f961d37e5ed1f5b777feb33189c0d7a0abd15d3d83bc39c8978 Mar 20 17:49:06 crc kubenswrapper[4690]: I0320 17:49:06.026085 4690 generic.go:334] "Generic (PLEG): container finished" podID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerID="60750e14604bc581437fd2c4ffba76bd95365508efd63fbeb533987498363244" exitCode=0 Mar 20 17:49:06 crc kubenswrapper[4690]: I0320 17:49:06.026356 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" event={"ID":"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc","Type":"ContainerDied","Data":"60750e14604bc581437fd2c4ffba76bd95365508efd63fbeb533987498363244"} Mar 20 17:49:06 crc kubenswrapper[4690]: I0320 17:49:06.026434 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" event={"ID":"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc","Type":"ContainerStarted","Data":"94b9f9c945882f961d37e5ed1f5b777feb33189c0d7a0abd15d3d83bc39c8978"} Mar 20 17:49:07 crc kubenswrapper[4690]: I0320 17:49:07.033736 4690 generic.go:334] "Generic (PLEG): container finished" podID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerID="03aa867f1dfcc060b92e6b50bba032e63a1273e572e90d78c636006eeb42dc50" exitCode=0 Mar 20 17:49:07 crc kubenswrapper[4690]: I0320 17:49:07.033829 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" event={"ID":"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc","Type":"ContainerDied","Data":"03aa867f1dfcc060b92e6b50bba032e63a1273e572e90d78c636006eeb42dc50"} Mar 20 17:49:08 crc kubenswrapper[4690]: I0320 17:49:08.044102 4690 generic.go:334] "Generic (PLEG): container finished" podID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerID="bd7d4df445662d44cf6cfc1c10c9b63d5d66299e8f3a041f37675117477b0b40" exitCode=0 Mar 20 17:49:08 crc kubenswrapper[4690]: I0320 17:49:08.044196 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" event={"ID":"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc","Type":"ContainerDied","Data":"bd7d4df445662d44cf6cfc1c10c9b63d5d66299e8f3a041f37675117477b0b40"} Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.373513 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.514300 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-bundle\") pod \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.514427 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvxwk\" (UniqueName: \"kubernetes.io/projected/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-kube-api-access-hvxwk\") pod \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.514493 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-util\") pod \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\" (UID: \"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc\") " Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.515218 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-bundle" (OuterVolumeSpecName: "bundle") pod "d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" (UID: "d1b6dbe3-2fff-4985-91e0-270e2d42fcbc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.520397 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-kube-api-access-hvxwk" (OuterVolumeSpecName: "kube-api-access-hvxwk") pod "d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" (UID: "d1b6dbe3-2fff-4985-91e0-270e2d42fcbc"). InnerVolumeSpecName "kube-api-access-hvxwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.537342 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-util" (OuterVolumeSpecName: "util") pod "d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" (UID: "d1b6dbe3-2fff-4985-91e0-270e2d42fcbc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.616197 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvxwk\" (UniqueName: \"kubernetes.io/projected/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-kube-api-access-hvxwk\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.616643 4690 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:09 crc kubenswrapper[4690]: I0320 17:49:09.616665 4690 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1b6dbe3-2fff-4985-91e0-270e2d42fcbc-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:10 crc kubenswrapper[4690]: I0320 17:49:10.060492 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" event={"ID":"d1b6dbe3-2fff-4985-91e0-270e2d42fcbc","Type":"ContainerDied","Data":"94b9f9c945882f961d37e5ed1f5b777feb33189c0d7a0abd15d3d83bc39c8978"} Mar 20 17:49:10 crc kubenswrapper[4690]: I0320 17:49:10.060568 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94b9f9c945882f961d37e5ed1f5b777feb33189c0d7a0abd15d3d83bc39c8978" Mar 20 17:49:10 crc kubenswrapper[4690]: I0320 17:49:10.060612 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.626224 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn"] Mar 20 17:49:17 crc kubenswrapper[4690]: E0320 17:49:17.626878 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerName="util" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.626890 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerName="util" Mar 20 17:49:17 crc kubenswrapper[4690]: E0320 17:49:17.626899 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerName="extract" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.626905 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerName="extract" Mar 20 17:49:17 crc kubenswrapper[4690]: E0320 17:49:17.626918 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerName="pull" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.626923 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerName="pull" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.627024 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b6dbe3-2fff-4985-91e0-270e2d42fcbc" containerName="extract" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.627425 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.629044 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-8kc9b" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.648794 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn"] Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.827707 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942dg\" (UniqueName: \"kubernetes.io/projected/2e2d986c-6f20-4436-a367-df98a71f79f0-kube-api-access-942dg\") pod \"openstack-operator-controller-init-77cd8cbff5-64vnn\" (UID: \"2e2d986c-6f20-4436-a367-df98a71f79f0\") " pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.928855 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-942dg\" (UniqueName: \"kubernetes.io/projected/2e2d986c-6f20-4436-a367-df98a71f79f0-kube-api-access-942dg\") pod \"openstack-operator-controller-init-77cd8cbff5-64vnn\" (UID: \"2e2d986c-6f20-4436-a367-df98a71f79f0\") " pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" Mar 20 17:49:17 crc kubenswrapper[4690]: I0320 17:49:17.965414 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942dg\" (UniqueName: \"kubernetes.io/projected/2e2d986c-6f20-4436-a367-df98a71f79f0-kube-api-access-942dg\") pod \"openstack-operator-controller-init-77cd8cbff5-64vnn\" (UID: \"2e2d986c-6f20-4436-a367-df98a71f79f0\") " pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" Mar 20 17:49:18 crc kubenswrapper[4690]: I0320 17:49:18.243215 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" Mar 20 17:49:18 crc kubenswrapper[4690]: I0320 17:49:18.718962 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn"] Mar 20 17:49:19 crc kubenswrapper[4690]: I0320 17:49:19.124867 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" event={"ID":"2e2d986c-6f20-4436-a367-df98a71f79f0","Type":"ContainerStarted","Data":"4f04c8dfbdc0e4acde7b243470bcb5896fa40785844b6a97ab4edad4f2aec6cd"} Mar 20 17:49:23 crc kubenswrapper[4690]: I0320 17:49:23.153862 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" event={"ID":"2e2d986c-6f20-4436-a367-df98a71f79f0","Type":"ContainerStarted","Data":"d95cd1439e4b871cb8e5e300f28999eb3f0490043f1c36a29625f0976745ab4b"} Mar 20 17:49:23 crc kubenswrapper[4690]: I0320 17:49:23.155332 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" Mar 20 17:49:23 crc kubenswrapper[4690]: I0320 17:49:23.198798 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" podStartSLOduration=2.693769972 podStartE2EDuration="6.198764281s" podCreationTimestamp="2026-03-20 17:49:17 +0000 UTC" firstStartedPulling="2026-03-20 17:49:18.708166058 +0000 UTC m=+1033.573991736" lastFinishedPulling="2026-03-20 17:49:22.213160367 +0000 UTC m=+1037.078986045" observedRunningTime="2026-03-20 17:49:23.188436794 +0000 UTC m=+1038.054262512" watchObservedRunningTime="2026-03-20 17:49:23.198764281 +0000 UTC m=+1038.064589999" Mar 20 17:49:28 crc kubenswrapper[4690]: I0320 17:49:28.247133 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-77cd8cbff5-64vnn" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.686409 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.688213 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.693310 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5h5vj" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.706979 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.731516 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.732349 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.746544 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-n25ht" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.780537 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.809182 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.809983 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.818994 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ptggn" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.819188 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgsdm\" (UniqueName: \"kubernetes.io/projected/a3db7a74-f9a7-4dfc-89a3-5727f538a3a7-kube-api-access-tgsdm\") pod \"cinder-operator-controller-manager-8d58dc466-mbp48\" (UID: \"a3db7a74-f9a7-4dfc-89a3-5727f538a3a7\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.819264 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s55kp\" (UniqueName: \"kubernetes.io/projected/06fbcef9-d6fa-4dac-bfeb-93e3fc501f55-kube-api-access-s55kp\") pod \"barbican-operator-controller-manager-59bc569d95-5bf49\" (UID: \"06fbcef9-d6fa-4dac-bfeb-93e3fc501f55\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.819314 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65glm\" (UniqueName: \"kubernetes.io/projected/1c5d887a-7a69-4f43-8b75-36de19325428-kube-api-access-65glm\") pod \"designate-operator-controller-manager-588d4d986b-27xp7\" (UID: \"1c5d887a-7a69-4f43-8b75-36de19325428\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.827098 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.827966 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.831619 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-s5gxl" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.844397 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.845213 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.847022 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7jh2f" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.857748 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.872780 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.902754 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.902791 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.903497 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.904083 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.904499 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.909447 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.909619 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.909838 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-q6spz" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.909996 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-45655" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.912508 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.917529 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.918401 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.921930 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s55kp\" (UniqueName: \"kubernetes.io/projected/06fbcef9-d6fa-4dac-bfeb-93e3fc501f55-kube-api-access-s55kp\") pod \"barbican-operator-controller-manager-59bc569d95-5bf49\" (UID: \"06fbcef9-d6fa-4dac-bfeb-93e3fc501f55\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.921987 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5prxm\" (UniqueName: \"kubernetes.io/projected/6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1-kube-api-access-5prxm\") pod \"glance-operator-controller-manager-79df6bcc97-wclkd\" (UID: \"6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.922015 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22p7\" (UniqueName: \"kubernetes.io/projected/d71d628c-8060-418c-b0bf-f83193220e88-kube-api-access-w22p7\") pod \"heat-operator-controller-manager-67dd5f86f5-rtb26\" (UID: \"d71d628c-8060-418c-b0bf-f83193220e88\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.922043 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2shbt\" (UniqueName: \"kubernetes.io/projected/535eb2e4-3de8-49bd-97a8-135823a8d1c9-kube-api-access-2shbt\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.922062 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65glm\" (UniqueName: \"kubernetes.io/projected/1c5d887a-7a69-4f43-8b75-36de19325428-kube-api-access-65glm\") pod \"designate-operator-controller-manager-588d4d986b-27xp7\" (UID: \"1c5d887a-7a69-4f43-8b75-36de19325428\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.922102 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.922133 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd8lb\" (UniqueName: \"kubernetes.io/projected/3871373d-0b43-4e90-84f8-01ee2e8e4159-kube-api-access-zd8lb\") pod \"horizon-operator-controller-manager-8464cc45fb-w2m9s\" (UID: \"3871373d-0b43-4e90-84f8-01ee2e8e4159\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.922159 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgsdm\" 
(UniqueName: \"kubernetes.io/projected/a3db7a74-f9a7-4dfc-89a3-5727f538a3a7-kube-api-access-tgsdm\") pod \"cinder-operator-controller-manager-8d58dc466-mbp48\" (UID: \"a3db7a74-f9a7-4dfc-89a3-5727f538a3a7\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.926435 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.942841 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-tllng"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.943918 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.948313 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-tllng"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.953969 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kzrl4" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.954177 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-q2j8j" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.961620 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgsdm\" (UniqueName: \"kubernetes.io/projected/a3db7a74-f9a7-4dfc-89a3-5727f538a3a7-kube-api-access-tgsdm\") pod \"cinder-operator-controller-manager-8d58dc466-mbp48\" (UID: \"a3db7a74-f9a7-4dfc-89a3-5727f538a3a7\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.962548 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s55kp\" (UniqueName: \"kubernetes.io/projected/06fbcef9-d6fa-4dac-bfeb-93e3fc501f55-kube-api-access-s55kp\") pod \"barbican-operator-controller-manager-59bc569d95-5bf49\" (UID: \"06fbcef9-d6fa-4dac-bfeb-93e3fc501f55\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.975913 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65glm\" (UniqueName: \"kubernetes.io/projected/1c5d887a-7a69-4f43-8b75-36de19325428-kube-api-access-65glm\") pod \"designate-operator-controller-manager-588d4d986b-27xp7\" (UID: \"1c5d887a-7a69-4f43-8b75-36de19325428\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.978126 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.984056 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.986515 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.995561 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fs4g6" Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.995730 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7"] Mar 20 17:49:45 crc kubenswrapper[4690]: I0320 17:49:45.996465 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.008858 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.017824 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023008 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lwp2\" (UniqueName: \"kubernetes.io/projected/dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4-kube-api-access-6lwp2\") pod \"keystone-operator-controller-manager-768b96df4c-xgkr9\" (UID: \"dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023047 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdpmp\" (UniqueName: \"kubernetes.io/projected/09c39274-3aa3-4774-98e2-10f70b707a97-kube-api-access-kdpmp\") pod \"ironic-operator-controller-manager-6f787dddc9-mmwbh\" (UID: \"09c39274-3aa3-4774-98e2-10f70b707a97\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023078 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023108 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8lb\" (UniqueName: \"kubernetes.io/projected/3871373d-0b43-4e90-84f8-01ee2e8e4159-kube-api-access-zd8lb\") pod \"horizon-operator-controller-manager-8464cc45fb-w2m9s\" (UID: \"3871373d-0b43-4e90-84f8-01ee2e8e4159\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023157 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbt9f\" (UniqueName: \"kubernetes.io/projected/3efbe084-4e50-405e-b477-b3b87635d465-kube-api-access-qbt9f\") pod \"mariadb-operator-controller-manager-67ccfc9778-pxcl7\" (UID: 
\"3efbe084-4e50-405e-b477-b3b87635d465\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023180 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5prxm\" (UniqueName: \"kubernetes.io/projected/6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1-kube-api-access-5prxm\") pod \"glance-operator-controller-manager-79df6bcc97-wclkd\" (UID: \"6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023210 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22p7\" (UniqueName: \"kubernetes.io/projected/d71d628c-8060-418c-b0bf-f83193220e88-kube-api-access-w22p7\") pod \"heat-operator-controller-manager-67dd5f86f5-rtb26\" (UID: \"d71d628c-8060-418c-b0bf-f83193220e88\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023243 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxx4\" (UniqueName: \"kubernetes.io/projected/a8f81ddb-a5b3-4881-88de-66ed78d8d344-kube-api-access-9sxx4\") pod \"manila-operator-controller-manager-55f864c847-tllng\" (UID: \"a8f81ddb-a5b3-4881-88de-66ed78d8d344\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.023280 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2shbt\" (UniqueName: \"kubernetes.io/projected/535eb2e4-3de8-49bd-97a8-135823a8d1c9-kube-api-access-2shbt\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.023734 4690 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.023774 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert podName:535eb2e4-3de8-49bd-97a8-135823a8d1c9 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:46.523760048 +0000 UTC m=+1061.389585726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert") pod "infra-operator-controller-manager-c55d6cc99-gzcjf" (UID: "535eb2e4-3de8-49bd-97a8-135823a8d1c9") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.024381 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-spklb" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.032428 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.033291 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.052495 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-cppz6" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.061570 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.062523 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5prxm\" (UniqueName: \"kubernetes.io/projected/6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1-kube-api-access-5prxm\") pod \"glance-operator-controller-manager-79df6bcc97-wclkd\" (UID: \"6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.073977 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2shbt\" (UniqueName: \"kubernetes.io/projected/535eb2e4-3de8-49bd-97a8-135823a8d1c9-kube-api-access-2shbt\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.074039 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.097368 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22p7\" (UniqueName: \"kubernetes.io/projected/d71d628c-8060-418c-b0bf-f83193220e88-kube-api-access-w22p7\") pod \"heat-operator-controller-manager-67dd5f86f5-rtb26\" (UID: \"d71d628c-8060-418c-b0bf-f83193220e88\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.098112 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-55787"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.099017 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8lb\" (UniqueName: \"kubernetes.io/projected/3871373d-0b43-4e90-84f8-01ee2e8e4159-kube-api-access-zd8lb\") pod \"horizon-operator-controller-manager-8464cc45fb-w2m9s\" (UID: \"3871373d-0b43-4e90-84f8-01ee2e8e4159\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.103411 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.105567 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4jvhd" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.112693 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.117340 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.118833 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.121355 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8qgm5" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.124021 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdpmp\" (UniqueName: \"kubernetes.io/projected/09c39274-3aa3-4774-98e2-10f70b707a97-kube-api-access-kdpmp\") pod \"ironic-operator-controller-manager-6f787dddc9-mmwbh\" (UID: \"09c39274-3aa3-4774-98e2-10f70b707a97\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.124060 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94c7w\" (UniqueName: \"kubernetes.io/projected/a31564d4-ce19-4893-bde8-871ced7c077b-kube-api-access-94c7w\") pod \"octavia-operator-controller-manager-5b9f45d989-xzzx7\" (UID: \"a31564d4-ce19-4893-bde8-871ced7c077b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.124120 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lwkj\" (UniqueName: \"kubernetes.io/projected/61746313-5249-48cd-8dae-f7984ba74f85-kube-api-access-4lwkj\") pod \"neutron-operator-controller-manager-767865f676-xs6lt\" (UID: \"61746313-5249-48cd-8dae-f7984ba74f85\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.124154 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbt9f\" (UniqueName: \"kubernetes.io/projected/3efbe084-4e50-405e-b477-b3b87635d465-kube-api-access-qbt9f\") pod \"mariadb-operator-controller-manager-67ccfc9778-pxcl7\" (UID: \"3efbe084-4e50-405e-b477-b3b87635d465\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.124188 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxx4\" (UniqueName: \"kubernetes.io/projected/a8f81ddb-a5b3-4881-88de-66ed78d8d344-kube-api-access-9sxx4\") pod \"manila-operator-controller-manager-55f864c847-tllng\" (UID: \"a8f81ddb-a5b3-4881-88de-66ed78d8d344\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.124210 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lwp2\" (UniqueName: \"kubernetes.io/projected/dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4-kube-api-access-6lwp2\") pod \"keystone-operator-controller-manager-768b96df4c-xgkr9\" (UID: \"dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" Mar 20 17:49:46 crc 
kubenswrapper[4690]: I0320 17:49:46.124227 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prblj\" (UniqueName: \"kubernetes.io/projected/458fe699-42d5-44ad-9288-3b6fbcd87161-kube-api-access-prblj\") pod \"nova-operator-controller-manager-5d488d59fb-55787\" (UID: \"458fe699-42d5-44ad-9288-3b6fbcd87161\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.132744 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-55787"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.148067 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.155282 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.168873 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.183578 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.190292 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdpmp\" (UniqueName: \"kubernetes.io/projected/09c39274-3aa3-4774-98e2-10f70b707a97-kube-api-access-kdpmp\") pod \"ironic-operator-controller-manager-6f787dddc9-mmwbh\" (UID: \"09c39274-3aa3-4774-98e2-10f70b707a97\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.190826 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxx4\" (UniqueName: \"kubernetes.io/projected/a8f81ddb-a5b3-4881-88de-66ed78d8d344-kube-api-access-9sxx4\") pod \"manila-operator-controller-manager-55f864c847-tllng\" (UID: \"a8f81ddb-a5b3-4881-88de-66ed78d8d344\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.191014 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lwp2\" (UniqueName: \"kubernetes.io/projected/dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4-kube-api-access-6lwp2\") pod \"keystone-operator-controller-manager-768b96df4c-xgkr9\" (UID: \"dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.202268 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbt9f\" (UniqueName: \"kubernetes.io/projected/3efbe084-4e50-405e-b477-b3b87635d465-kube-api-access-qbt9f\") pod \"mariadb-operator-controller-manager-67ccfc9778-pxcl7\" (UID: \"3efbe084-4e50-405e-b477-b3b87635d465\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.203641 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b"] Mar 20 17:49:46 crc 
kubenswrapper[4690]: I0320 17:49:46.204515 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.213425 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.214031 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5z8z7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.229772 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prblj\" (UniqueName: \"kubernetes.io/projected/458fe699-42d5-44ad-9288-3b6fbcd87161-kube-api-access-prblj\") pod \"nova-operator-controller-manager-5d488d59fb-55787\" (UID: \"458fe699-42d5-44ad-9288-3b6fbcd87161\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.229830 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94c7w\" (UniqueName: \"kubernetes.io/projected/a31564d4-ce19-4893-bde8-871ced7c077b-kube-api-access-94c7w\") pod \"octavia-operator-controller-manager-5b9f45d989-xzzx7\" (UID: \"a31564d4-ce19-4893-bde8-871ced7c077b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.229894 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lwkj\" (UniqueName: \"kubernetes.io/projected/61746313-5249-48cd-8dae-f7984ba74f85-kube-api-access-4lwkj\") pod \"neutron-operator-controller-manager-767865f676-xs6lt\" (UID: \"61746313-5249-48cd-8dae-f7984ba74f85\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.230580 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-fm55z"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.239674 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.241509 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.255441 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-zz2g7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.258991 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.281131 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94c7w\" (UniqueName: \"kubernetes.io/projected/a31564d4-ce19-4893-bde8-871ced7c077b-kube-api-access-94c7w\") pod \"octavia-operator-controller-manager-5b9f45d989-xzzx7\" (UID: \"a31564d4-ce19-4893-bde8-871ced7c077b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.297971 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lwkj\" (UniqueName: \"kubernetes.io/projected/61746313-5249-48cd-8dae-f7984ba74f85-kube-api-access-4lwkj\") pod \"neutron-operator-controller-manager-767865f676-xs6lt\" (UID: \"61746313-5249-48cd-8dae-f7984ba74f85\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.304210 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prblj\" (UniqueName: \"kubernetes.io/projected/458fe699-42d5-44ad-9288-3b6fbcd87161-kube-api-access-prblj\") pod \"nova-operator-controller-manager-5d488d59fb-55787\" (UID: \"458fe699-42d5-44ad-9288-3b6fbcd87161\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.318468 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.319007 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-48tn7"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.335648 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqqzg\" (UniqueName: \"kubernetes.io/projected/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-kube-api-access-qqqzg\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.335885 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.335990 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2wmd\" (UniqueName: \"kubernetes.io/projected/a27fabe1-095d-4c34-8e91-862aa1dbf964-kube-api-access-r2wmd\") pod \"placement-operator-controller-manager-5784578c99-fm55z\" (UID: \"a27fabe1-095d-4c34-8e91-862aa1dbf964\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.350803 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b"] Mar 20 17:49:46 crc 
kubenswrapper[4690]: I0320 17:49:46.350989 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.357395 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-sn9fv" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.378712 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-klmr6"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.380418 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.381135 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.384227 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2hj6g" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.388431 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-fm55z"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.397827 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-48tn7"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.412007 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-klmr6"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.413952 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.423851 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.430528 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.438235 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqqzg\" (UniqueName: \"kubernetes.io/projected/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-kube-api-access-qqqzg\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.438291 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.438310 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2wmd\" (UniqueName: \"kubernetes.io/projected/a27fabe1-095d-4c34-8e91-862aa1dbf964-kube-api-access-r2wmd\") pod \"placement-operator-controller-manager-5784578c99-fm55z\" (UID: \"a27fabe1-095d-4c34-8e91-862aa1dbf964\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.438374 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzn5f\" (UniqueName: \"kubernetes.io/projected/3730dc8b-cf83-4f29-ac0b-3776ef3efeba-kube-api-access-zzn5f\") pod \"ovn-operator-controller-manager-884679f54-48tn7\" (UID: \"3730dc8b-cf83-4f29-ac0b-3776ef3efeba\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.438803 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5q2fx" Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.438920 4690 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.438959 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert podName:f3bca6f7-be2b-4420-8664-b94ba53d5f7f nodeName:}" failed. No retries permitted until 2026-03-20 17:49:46.938946372 +0000 UTC m=+1061.804772050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" (UID: "f3bca6f7-be2b-4420-8664-b94ba53d5f7f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.444297 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.452632 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.463364 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.465417 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.473662 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2wmd\" (UniqueName: \"kubernetes.io/projected/a27fabe1-095d-4c34-8e91-862aa1dbf964-kube-api-access-r2wmd\") pod \"placement-operator-controller-manager-5784578c99-fm55z\" (UID: \"a27fabe1-095d-4c34-8e91-862aa1dbf964\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.473899 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7r9sj" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.480447 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.485433 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqqzg\" (UniqueName: \"kubernetes.io/projected/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-kube-api-access-qqqzg\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.487557 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.509929 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.524366 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.525660 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.528059 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.528646 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gjx5m" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.540183 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2csf7\" (UniqueName: \"kubernetes.io/projected/64a2959f-0b79-4b19-934b-486aad0e782b-kube-api-access-2csf7\") pod \"swift-operator-controller-manager-c674c5965-klmr6\" (UID: \"64a2959f-0b79-4b19-934b-486aad0e782b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.540241 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzn5f\" (UniqueName: \"kubernetes.io/projected/3730dc8b-cf83-4f29-ac0b-3776ef3efeba-kube-api-access-zzn5f\") pod \"ovn-operator-controller-manager-884679f54-48tn7\" (UID: \"3730dc8b-cf83-4f29-ac0b-3776ef3efeba\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.540323 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.540365 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rm4h\" (UniqueName: \"kubernetes.io/projected/c7a8e424-00f3-4e97-b6b3-bd2513624b2e-kube-api-access-4rm4h\") pod \"telemetry-operator-controller-manager-d6b694c5-8kl25\" (UID: \"c7a8e424-00f3-4e97-b6b3-bd2513624b2e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.540399 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68zw\" (UniqueName: \"kubernetes.io/projected/d7d5bc08-99d0-4361-ae0e-ca9732db6154-kube-api-access-d68zw\") pod \"test-operator-controller-manager-5c5cb9c4d7-vnl4s\" (UID: \"d7d5bc08-99d0-4361-ae0e-ca9732db6154\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.540945 4690 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.541029 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert podName:535eb2e4-3de8-49bd-97a8-135823a8d1c9 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:47.541005443 +0000 UTC m=+1062.406831171 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert") pod "infra-operator-controller-manager-c55d6cc99-gzcjf" (UID: "535eb2e4-3de8-49bd-97a8-135823a8d1c9") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.566038 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.568626 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.570538 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.572511 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2jwf2" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.580787 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.581603 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.584773 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzn5f\" (UniqueName: \"kubernetes.io/projected/3730dc8b-cf83-4f29-ac0b-3776ef3efeba-kube-api-access-zzn5f\") pod \"ovn-operator-controller-manager-884679f54-48tn7\" (UID: \"3730dc8b-cf83-4f29-ac0b-3776ef3efeba\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.648373 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2csf7\" (UniqueName: \"kubernetes.io/projected/64a2959f-0b79-4b19-934b-486aad0e782b-kube-api-access-2csf7\") pod \"swift-operator-controller-manager-c674c5965-klmr6\" (UID: \"64a2959f-0b79-4b19-934b-486aad0e782b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.648449 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4cq8\" (UniqueName: \"kubernetes.io/projected/64ced890-6363-43c3-83e5-0001c72851ef-kube-api-access-q4cq8\") pod \"watcher-operator-controller-manager-6c4d75f7f9-pv7x5\" (UID: \"64ced890-6363-43c3-83e5-0001c72851ef\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.648752 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rm4h\" (UniqueName: \"kubernetes.io/projected/c7a8e424-00f3-4e97-b6b3-bd2513624b2e-kube-api-access-4rm4h\") pod \"telemetry-operator-controller-manager-d6b694c5-8kl25\" (UID: \"c7a8e424-00f3-4e97-b6b3-bd2513624b2e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.648805 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d68zw\" (UniqueName: 
\"kubernetes.io/projected/d7d5bc08-99d0-4361-ae0e-ca9732db6154-kube-api-access-d68zw\") pod \"test-operator-controller-manager-5c5cb9c4d7-vnl4s\" (UID: \"d7d5bc08-99d0-4361-ae0e-ca9732db6154\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.666421 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68zw\" (UniqueName: \"kubernetes.io/projected/d7d5bc08-99d0-4361-ae0e-ca9732db6154-kube-api-access-d68zw\") pod \"test-operator-controller-manager-5c5cb9c4d7-vnl4s\" (UID: \"d7d5bc08-99d0-4361-ae0e-ca9732db6154\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.673047 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2csf7\" (UniqueName: \"kubernetes.io/projected/64a2959f-0b79-4b19-934b-486aad0e782b-kube-api-access-2csf7\") pod \"swift-operator-controller-manager-c674c5965-klmr6\" (UID: \"64a2959f-0b79-4b19-934b-486aad0e782b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.675547 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rm4h\" (UniqueName: \"kubernetes.io/projected/c7a8e424-00f3-4e97-b6b3-bd2513624b2e-kube-api-access-4rm4h\") pod \"telemetry-operator-controller-manager-d6b694c5-8kl25\" (UID: \"c7a8e424-00f3-4e97-b6b3-bd2513624b2e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.739781 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.750011 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.750173 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.750241 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4cq8\" (UniqueName: \"kubernetes.io/projected/64ced890-6363-43c3-83e5-0001c72851ef-kube-api-access-q4cq8\") pod \"watcher-operator-controller-manager-6c4d75f7f9-pv7x5\" (UID: \"64ced890-6363-43c3-83e5-0001c72851ef\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.750291 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bgpj\" (UniqueName: \"kubernetes.io/projected/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-kube-api-access-4bgpj\") pod 
\"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.774916 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4cq8\" (UniqueName: \"kubernetes.io/projected/64ced890-6363-43c3-83e5-0001c72851ef-kube-api-access-q4cq8\") pod \"watcher-operator-controller-manager-6c4d75f7f9-pv7x5\" (UID: \"64ced890-6363-43c3-83e5-0001c72851ef\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.836880 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.851597 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.851676 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.851727 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bgpj\" (UniqueName: \"kubernetes.io/projected/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-kube-api-access-4bgpj\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.851752 4690 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.851815 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:47.351796548 +0000 UTC m=+1062.217622226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.852022 4690 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.852045 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:49:47.352039035 +0000 UTC m=+1062.217864713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "metrics-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.867788 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.869047 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bgpj\" (UniqueName: \"kubernetes.io/projected/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-kube-api-access-4bgpj\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.910152 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.927850 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.928197 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48"] Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.943987 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.953420 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.953693 4690 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: E0320 17:49:46.953751 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert podName:f3bca6f7-be2b-4420-8664-b94ba53d5f7f nodeName:}" failed. No retries permitted until 2026-03-20 17:49:47.953733256 +0000 UTC m=+1062.819558934 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" (UID: "f3bca6f7-be2b-4420-8664-b94ba53d5f7f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:46 crc kubenswrapper[4690]: I0320 17:49:46.972460 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s"] Mar 20 17:49:46 crc kubenswrapper[4690]: W0320 17:49:46.994908 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3871373d_0b43_4e90_84f8_01ee2e8e4159.slice/crio-8043f2cce7ed55b32a751966c3e638976a5027622a4e6ba6bf10776ec8ebf5bf WatchSource:0}: Error finding container 8043f2cce7ed55b32a751966c3e638976a5027622a4e6ba6bf10776ec8ebf5bf: Status 404 returned error can't find the container with id 8043f2cce7ed55b32a751966c3e638976a5027622a4e6ba6bf10776ec8ebf5bf Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.078533 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.212018 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.253426 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.259822 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.330869 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.362438 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.367206 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.367320 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.367514 4690 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.367587 4690 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 
17:49:47.367592 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:48.367572771 +0000 UTC m=+1063.233398519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "metrics-server-cert" not found Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.367684 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:48.367662543 +0000 UTC m=+1063.233488281 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "webhook-server-cert" not found Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.370197 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-fm55z"] Mar 20 17:49:47 crc kubenswrapper[4690]: W0320 17:49:47.376866 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda27fabe1_095d_4c34_8e91_862aa1dbf964.slice/crio-f8e615f2f6d6c521a4cb669bb8641eea509c67325b805145c258f1295874534b WatchSource:0}: Error finding container f8e615f2f6d6c521a4cb669bb8641eea509c67325b805145c258f1295874534b: Status 404 returned error can't find the container with id f8e615f2f6d6c521a4cb669bb8641eea509c67325b805145c258f1295874534b Mar 20 17:49:47 crc kubenswrapper[4690]: W0320 17:49:47.377185 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda31564d4_ce19_4893_bde8_871ced7c077b.slice/crio-e65f32389b91fc10daa1e7b243669fba4d92477e662fe2ffffd9b252cb72d2d8 WatchSource:0}: Error finding container e65f32389b91fc10daa1e7b243669fba4d92477e662fe2ffffd9b252cb72d2d8: Status 404 returned error can't find the container with id e65f32389b91fc10daa1e7b243669fba4d92477e662fe2ffffd9b252cb72d2d8 Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.412232 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" event={"ID":"6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1","Type":"ContainerStarted","Data":"c7e85f22a7ed9d8f43a443fa6df299794fbfff84e6f8100b998cfa62716cfe08"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.415827 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" event={"ID":"3efbe084-4e50-405e-b477-b3b87635d465","Type":"ContainerStarted","Data":"6ad6f8c1bb2a40fcc6e4b94b5a6d6199fbf61ac8aaba95970ac4f62158572a5b"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.417691 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" 
event={"ID":"61746313-5249-48cd-8dae-f7984ba74f85","Type":"ContainerStarted","Data":"f5e1f1ef86c29171a2f2b9729b9d3104c9a7ee5220fd4713930c6f1aeb846578"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.419711 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" event={"ID":"d71d628c-8060-418c-b0bf-f83193220e88","Type":"ContainerStarted","Data":"afc4e1c8f6a16d9a0982f35676a90550649b87f7e83ee7d683183199c055e7b6"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.420499 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" event={"ID":"a3db7a74-f9a7-4dfc-89a3-5727f538a3a7","Type":"ContainerStarted","Data":"680e1e59bc3957ae0d1608804d5e334bb120ab271d1147ece865efebff04ae2f"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.421395 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" event={"ID":"3871373d-0b43-4e90-84f8-01ee2e8e4159","Type":"ContainerStarted","Data":"8043f2cce7ed55b32a751966c3e638976a5027622a4e6ba6bf10776ec8ebf5bf"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.424145 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" event={"ID":"1c5d887a-7a69-4f43-8b75-36de19325428","Type":"ContainerStarted","Data":"88968dd99a32ca5e850a464586a85e181a05beb4609c73f4e424c03d5b282878"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.425527 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" event={"ID":"a31564d4-ce19-4893-bde8-871ced7c077b","Type":"ContainerStarted","Data":"e65f32389b91fc10daa1e7b243669fba4d92477e662fe2ffffd9b252cb72d2d8"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.426870 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" event={"ID":"a27fabe1-095d-4c34-8e91-862aa1dbf964","Type":"ContainerStarted","Data":"f8e615f2f6d6c521a4cb669bb8641eea509c67325b805145c258f1295874534b"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.427886 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" event={"ID":"06fbcef9-d6fa-4dac-bfeb-93e3fc501f55","Type":"ContainerStarted","Data":"8f47453afee6ee13a85f79fa45efc37d2f4ea132d075116106054c8aa0716ccd"} Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.545630 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-48tn7"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.562043 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-tllng"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.569230 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.569532 4690 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.569614 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert podName:535eb2e4-3de8-49bd-97a8-135823a8d1c9 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:49.569588932 +0000 UTC m=+1064.435414650 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert") pod "infra-operator-controller-manager-c55d6cc99-gzcjf" (UID: "535eb2e4-3de8-49bd-97a8-135823a8d1c9") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.573149 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.579065 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-55787"] Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.585878 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6lwp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
keystone-operator-controller-manager-768b96df4c-xgkr9_openstack-operators(dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.587170 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" podUID="dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4" Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.587265 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9"] Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.662715 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s"] Mar 20 17:49:47 crc kubenswrapper[4690]: W0320 17:49:47.667240 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d5bc08_99d0_4361_ae0e_ca9732db6154.slice/crio-f03dca1e0b9443ec1d3a1116734e1c3449ebb85b978e3cdc4c80704df1fc2aac WatchSource:0}: Error finding container f03dca1e0b9443ec1d3a1116734e1c3449ebb85b978e3cdc4c80704df1fc2aac: Status 404 returned error can't find the container with id f03dca1e0b9443ec1d3a1116734e1c3449ebb85b978e3cdc4c80704df1fc2aac Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.669417 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d68zw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-vnl4s_openstack-operators(d7d5bc08-99d0-4361-ae0e-ca9732db6154): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.670545 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25"] Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.671329 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" podUID="d7d5bc08-99d0-4361-ae0e-ca9732db6154" Mar 20 17:49:47 crc kubenswrapper[4690]: W0320 17:49:47.672608 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64a2959f_0b79_4b19_934b_486aad0e782b.slice/crio-10e50bf67398da240aa91dd5900c07f827d4be8a6ecd4f5d0b2202bfdd406d42 WatchSource:0}: Error finding container 10e50bf67398da240aa91dd5900c07f827d4be8a6ecd4f5d0b2202bfdd406d42: Status 404 returned error can't find the container with id 10e50bf67398da240aa91dd5900c07f827d4be8a6ecd4f5d0b2202bfdd406d42 Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.676423 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2csf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-klmr6_openstack-operators(64a2959f-0b79-4b19-934b-486aad0e782b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.678538 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" podUID="64a2959f-0b79-4b19-934b-486aad0e782b" Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.681085 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-klmr6"] Mar 20 17:49:47 crc kubenswrapper[4690]: W0320 17:49:47.682817 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7a8e424_00f3_4e97_b6b3_bd2513624b2e.slice/crio-f04cc28d9f3e17f970da5a9eb841d2909b568d3e12650aa1bc01fce98b7580fb WatchSource:0}: Error finding container f04cc28d9f3e17f970da5a9eb841d2909b568d3e12650aa1bc01fce98b7580fb: Status 404 returned error can't find the container with id f04cc28d9f3e17f970da5a9eb841d2909b568d3e12650aa1bc01fce98b7580fb Mar 20 17:49:47 crc kubenswrapper[4690]: W0320 17:49:47.684901 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64ced890_6363_43c3_83e5_0001c72851ef.slice/crio-f1586754999a032afebff66bafbf5ae17943ae3070834e1e08add963fdf6b9e8 WatchSource:0}: Error finding container f1586754999a032afebff66bafbf5ae17943ae3070834e1e08add963fdf6b9e8: Status 404 returned error can't find the container with id f1586754999a032afebff66bafbf5ae17943ae3070834e1e08add963fdf6b9e8 Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.687375 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5"] Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.687889 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q4cq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-pv7x5_openstack-operators(64ced890-6363-43c3-83e5-0001c72851ef): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.689704 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" podUID="64ced890-6363-43c3-83e5-0001c72851ef" Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.689781 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4rm4h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-8kl25_openstack-operators(c7a8e424-00f3-4e97-b6b3-bd2513624b2e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.691505 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" podUID="c7a8e424-00f3-4e97-b6b3-bd2513624b2e" Mar 20 17:49:47 crc kubenswrapper[4690]: I0320 17:49:47.974305 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.974504 4690 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:47 crc kubenswrapper[4690]: E0320 17:49:47.974903 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert podName:f3bca6f7-be2b-4420-8664-b94ba53d5f7f nodeName:}" failed. No retries permitted until 2026-03-20 17:49:49.9748828 +0000 UTC m=+1064.840708488 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" (UID: "f3bca6f7-be2b-4420-8664-b94ba53d5f7f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.379910 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.379994 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.380170 4690 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.380295 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:50.380280521 +0000 UTC m=+1065.246106199 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "metrics-server-cert" not found Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.380746 4690 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.380782 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:50.380770206 +0000 UTC m=+1065.246595884 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "webhook-server-cert" not found Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.438157 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" event={"ID":"64a2959f-0b79-4b19-934b-486aad0e782b","Type":"ContainerStarted","Data":"10e50bf67398da240aa91dd5900c07f827d4be8a6ecd4f5d0b2202bfdd406d42"} Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.442498 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" event={"ID":"c7a8e424-00f3-4e97-b6b3-bd2513624b2e","Type":"ContainerStarted","Data":"f04cc28d9f3e17f970da5a9eb841d2909b568d3e12650aa1bc01fce98b7580fb"} Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.442884 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" podUID="64a2959f-0b79-4b19-934b-486aad0e782b" Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.443949 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" podUID="c7a8e424-00f3-4e97-b6b3-bd2513624b2e" Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.447061 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" event={"ID":"458fe699-42d5-44ad-9288-3b6fbcd87161","Type":"ContainerStarted","Data":"bf7054913097ee7a54c7013c88777afa1970bb97f7f147e0d17026c6f912ef8f"} Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.451119 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" event={"ID":"09c39274-3aa3-4774-98e2-10f70b707a97","Type":"ContainerStarted","Data":"df4c28308038f9505392d5fbc924ee863f3f4ec39fc6119b023c0d3220f8810a"} Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.455292 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" event={"ID":"3730dc8b-cf83-4f29-ac0b-3776ef3efeba","Type":"ContainerStarted","Data":"4ecd552ea5239d3b7e9df7a07d2fccf014a2fe3ec68efdac9a3169eaf8116385"} Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.458221 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" event={"ID":"a8f81ddb-a5b3-4881-88de-66ed78d8d344","Type":"ContainerStarted","Data":"99d69de9f548e423354c6d125ad39880f59309ffb12f8291081c23ee734946a3"} Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.460763 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" 
event={"ID":"64ced890-6363-43c3-83e5-0001c72851ef","Type":"ContainerStarted","Data":"f1586754999a032afebff66bafbf5ae17943ae3070834e1e08add963fdf6b9e8"} Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.465411 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" event={"ID":"d7d5bc08-99d0-4361-ae0e-ca9732db6154","Type":"ContainerStarted","Data":"f03dca1e0b9443ec1d3a1116734e1c3449ebb85b978e3cdc4c80704df1fc2aac"} Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.465533 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" podUID="64ced890-6363-43c3-83e5-0001c72851ef" Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.468317 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" podUID="d7d5bc08-99d0-4361-ae0e-ca9732db6154" Mar 20 17:49:48 crc kubenswrapper[4690]: I0320 17:49:48.471525 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" event={"ID":"dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4","Type":"ContainerStarted","Data":"78268c48caf6669cde8e7f145610b96f3b4883a22065a3b4ce9c8b42faf6f0cc"} Mar 20 17:49:48 crc kubenswrapper[4690]: E0320 17:49:48.472794 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" podUID="dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4" Mar 20 17:49:49 crc kubenswrapper[4690]: E0320 17:49:49.480788 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" podUID="64a2959f-0b79-4b19-934b-486aad0e782b" Mar 20 17:49:49 crc kubenswrapper[4690]: E0320 17:49:49.481372 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" podUID="dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4" Mar 20 17:49:49 crc kubenswrapper[4690]: E0320 17:49:49.481427 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" podUID="c7a8e424-00f3-4e97-b6b3-bd2513624b2e" Mar 20 17:49:49 crc kubenswrapper[4690]: E0320 17:49:49.481465 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" podUID="64ced890-6363-43c3-83e5-0001c72851ef" Mar 20 17:49:49 crc kubenswrapper[4690]: E0320 17:49:49.481500 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" podUID="d7d5bc08-99d0-4361-ae0e-ca9732db6154" Mar 20 17:49:49 crc kubenswrapper[4690]: I0320 17:49:49.604044 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:49 crc kubenswrapper[4690]: E0320 17:49:49.604809 4690 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:49 crc kubenswrapper[4690]: E0320 17:49:49.604890 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert podName:535eb2e4-3de8-49bd-97a8-135823a8d1c9 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:53.604868429 +0000 UTC m=+1068.470694187 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert") pod "infra-operator-controller-manager-c55d6cc99-gzcjf" (UID: "535eb2e4-3de8-49bd-97a8-135823a8d1c9") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:50 crc kubenswrapper[4690]: I0320 17:49:50.009194 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:50 crc kubenswrapper[4690]: E0320 17:49:50.012964 4690 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:50 crc kubenswrapper[4690]: E0320 17:49:50.013011 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert podName:f3bca6f7-be2b-4420-8664-b94ba53d5f7f nodeName:}" failed. No retries permitted until 2026-03-20 17:49:54.012996049 +0000 UTC m=+1068.878821727 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" (UID: "f3bca6f7-be2b-4420-8664-b94ba53d5f7f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:50 crc kubenswrapper[4690]: I0320 17:49:50.417165 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:50 crc kubenswrapper[4690]: E0320 17:49:50.417371 4690 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:49:50 crc kubenswrapper[4690]: E0320 17:49:50.417448 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:54.417429653 +0000 UTC m=+1069.283255331 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "webhook-server-cert" not found Mar 20 17:49:50 crc kubenswrapper[4690]: I0320 17:49:50.417387 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:50 crc kubenswrapper[4690]: E0320 17:49:50.417565 4690 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:49:50 crc kubenswrapper[4690]: E0320 17:49:50.417667 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:49:54.417641319 +0000 UTC m=+1069.283467077 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "metrics-server-cert" not found Mar 20 17:49:53 crc kubenswrapper[4690]: I0320 17:49:53.662608 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:49:53 crc kubenswrapper[4690]: E0320 17:49:53.662927 4690 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:53 crc kubenswrapper[4690]: E0320 17:49:53.663039 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert podName:535eb2e4-3de8-49bd-97a8-135823a8d1c9 nodeName:}" failed. No retries permitted until 2026-03-20 17:50:01.663010125 +0000 UTC m=+1076.528835843 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert") pod "infra-operator-controller-manager-c55d6cc99-gzcjf" (UID: "535eb2e4-3de8-49bd-97a8-135823a8d1c9") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:49:54 crc kubenswrapper[4690]: I0320 17:49:54.068903 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:49:54 crc kubenswrapper[4690]: E0320 17:49:54.069099 4690 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:54 crc kubenswrapper[4690]: E0320 17:49:54.069181 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert podName:f3bca6f7-be2b-4420-8664-b94ba53d5f7f nodeName:}" failed. No retries permitted until 2026-03-20 17:50:02.069160019 +0000 UTC m=+1076.934985697 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" (UID: "f3bca6f7-be2b-4420-8664-b94ba53d5f7f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:49:54 crc kubenswrapper[4690]: I0320 17:49:54.473729 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:54 crc kubenswrapper[4690]: I0320 17:49:54.473812 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:49:54 crc kubenswrapper[4690]: E0320 17:49:54.474016 4690 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:49:54 crc kubenswrapper[4690]: E0320 17:49:54.474068 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:50:02.474052826 +0000 UTC m=+1077.339878524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "metrics-server-cert" not found Mar 20 17:49:54 crc kubenswrapper[4690]: E0320 17:49:54.474468 4690 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:49:54 crc kubenswrapper[4690]: E0320 17:49:54.474510 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:50:02.474499449 +0000 UTC m=+1077.340325137 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "webhook-server-cert" not found Mar 20 17:49:59 crc kubenswrapper[4690]: E0320 17:49:59.532613 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da" Mar 20 17:49:59 crc kubenswrapper[4690]: E0320 17:49:59.533722 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9sxx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-tllng_openstack-operators(a8f81ddb-a5b3-4881-88de-66ed78d8d344): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:49:59 crc kubenswrapper[4690]: E0320 17:49:59.535464 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" 
podUID="a8f81ddb-a5b3-4881-88de-66ed78d8d344" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.132059 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567150-4m2nn"] Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.133327 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.140083 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-4m2nn"] Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.195826 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.195861 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.196146 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.276286 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtgp\" (UniqueName: \"kubernetes.io/projected/cb74f825-5d0a-4a8a-8d15-d95cfdcf2729-kube-api-access-zrtgp\") pod \"auto-csr-approver-29567150-4m2nn\" (UID: \"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729\") " pod="openshift-infra/auto-csr-approver-29567150-4m2nn" Mar 20 17:50:00 crc kubenswrapper[4690]: E0320 17:50:00.371160 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 17:50:00 crc kubenswrapper[4690]: E0320 17:50:00.371377 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-prblj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-55787_openstack-operators(458fe699-42d5-44ad-9288-3b6fbcd87161): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:50:00 crc kubenswrapper[4690]: E0320 17:50:00.372600 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" podUID="458fe699-42d5-44ad-9288-3b6fbcd87161" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.378843 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtgp\" (UniqueName: \"kubernetes.io/projected/cb74f825-5d0a-4a8a-8d15-d95cfdcf2729-kube-api-access-zrtgp\") pod \"auto-csr-approver-29567150-4m2nn\" (UID: \"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729\") " pod="openshift-infra/auto-csr-approver-29567150-4m2nn" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.404581 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtgp\" (UniqueName: \"kubernetes.io/projected/cb74f825-5d0a-4a8a-8d15-d95cfdcf2729-kube-api-access-zrtgp\") pod \"auto-csr-approver-29567150-4m2nn\" (UID: \"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729\") " pod="openshift-infra/auto-csr-approver-29567150-4m2nn" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.520564 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" Mar 20 17:50:00 crc kubenswrapper[4690]: E0320 17:50:00.555736 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" podUID="458fe699-42d5-44ad-9288-3b6fbcd87161" Mar 20 17:50:00 crc kubenswrapper[4690]: E0320 17:50:00.556609 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" podUID="a8f81ddb-a5b3-4881-88de-66ed78d8d344" Mar 20 17:50:00 crc kubenswrapper[4690]: I0320 17:50:00.984217 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-4m2nn"] Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.556067 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" event={"ID":"06fbcef9-d6fa-4dac-bfeb-93e3fc501f55","Type":"ContainerStarted","Data":"8a65bdc1e8bbcfe94984499b2815f4b27e61323ae843ba311a8da87f329cf329"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.557024 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.558474 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" event={"ID":"3871373d-0b43-4e90-84f8-01ee2e8e4159","Type":"ContainerStarted","Data":"52aa73e18e82d3170f3bf51994546b1d9befa1eaab404b2f26cc2e1802ad7c7f"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.558665 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.560075 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" event={"ID":"3efbe084-4e50-405e-b477-b3b87635d465","Type":"ContainerStarted","Data":"6dc5f1d4104fecb1390e6a8caef0171ba1dc5bc7cda9aedaefdb45c4642e4f7f"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.560311 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.562142 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" event={"ID":"61746313-5249-48cd-8dae-f7984ba74f85","Type":"ContainerStarted","Data":"da1d85bd35cbe9e959d0e449efaee97a291d304bc45621c12e1fea20ad95911d"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.562320 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.563492 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" event={"ID":"a27fabe1-095d-4c34-8e91-862aa1dbf964","Type":"ContainerStarted","Data":"c01430fc6c3adba51c506d01f18f214173eea84af18ec8432fd33bc96fc56849"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.564078 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.567244 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" event={"ID":"6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1","Type":"ContainerStarted","Data":"10b33595db56868a17b6365bc7de20ed01e1634619d0ead1fb26f1be99e921af"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.568039 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.569141 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" event={"ID":"d71d628c-8060-418c-b0bf-f83193220e88","Type":"ContainerStarted","Data":"0fdebfd999ee2fb14b606978ab1549d41bb543cb54c69b817ecec0cb49396e39"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.569636 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.570895 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" event={"ID":"a31564d4-ce19-4893-bde8-871ced7c077b","Type":"ContainerStarted","Data":"fb8b5146a8de22cb87936358419033b88de4d7e43dfe81a77dfd0e4b3160c3cf"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.571016 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.572154 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" event={"ID":"09c39274-3aa3-4774-98e2-10f70b707a97","Type":"ContainerStarted","Data":"8a5f831c5e40430e001a415272e9473107d073e50e3fe41d7822ba2d89e617dd"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.572595 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.573719 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" event={"ID":"3730dc8b-cf83-4f29-ac0b-3776ef3efeba","Type":"ContainerStarted","Data":"9fc830f6f76ebfd7285982abc009d2c3933261d318ac7b86850aada52eff48b6"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.574169 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.575805 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" event={"ID":"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729","Type":"ContainerStarted","Data":"8a894c57066acf32b0b61c9c3cc7f5c07a316b635b935926062d22f80b793997"} Mar 
20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.577119 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" event={"ID":"1c5d887a-7a69-4f43-8b75-36de19325428","Type":"ContainerStarted","Data":"5675ad31a7ae0ca31e3c578985c28cdcbfa041e9bd1dd2f2470d2fd988c0e3ab"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.577303 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.578302 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" podStartSLOduration=3.041196301 podStartE2EDuration="16.578291857s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:46.764556444 +0000 UTC m=+1061.630382122" lastFinishedPulling="2026-03-20 17:50:00.301652 +0000 UTC m=+1075.167477678" observedRunningTime="2026-03-20 17:50:01.575059663 +0000 UTC m=+1076.440885341" watchObservedRunningTime="2026-03-20 17:50:01.578291857 +0000 UTC m=+1076.444117535" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.581295 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" event={"ID":"a3db7a74-f9a7-4dfc-89a3-5727f538a3a7","Type":"ContainerStarted","Data":"38230ab7c3bab5f19e3d0c3c23bf9e7a80bd33b4a8215d97e26a8bb8dd4ae5bb"} Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.581439 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.603018 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" podStartSLOduration=3.886338195 podStartE2EDuration="16.603001349s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.581382422 +0000 UTC m=+1062.447208110" lastFinishedPulling="2026-03-20 17:50:00.298045586 +0000 UTC m=+1075.163871264" observedRunningTime="2026-03-20 17:50:01.597992604 +0000 UTC m=+1076.463818282" watchObservedRunningTime="2026-03-20 17:50:01.603001349 +0000 UTC m=+1076.468827027" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.626145 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" podStartSLOduration=2.716231143 podStartE2EDuration="15.626128415s" podCreationTimestamp="2026-03-20 17:49:46 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.39044214 +0000 UTC m=+1062.256267818" lastFinishedPulling="2026-03-20 17:50:00.300339412 +0000 UTC m=+1075.166165090" observedRunningTime="2026-03-20 17:50:01.620653407 +0000 UTC m=+1076.486479085" watchObservedRunningTime="2026-03-20 17:50:01.626128415 +0000 UTC m=+1076.491954093" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.647849 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" podStartSLOduration=3.355298992 podStartE2EDuration="16.64783071s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.007179706 +0000 UTC m=+1061.873005384" lastFinishedPulling="2026-03-20 
17:50:00.299711424 +0000 UTC m=+1075.165537102" observedRunningTime="2026-03-20 17:50:01.646638886 +0000 UTC m=+1076.512464564" watchObservedRunningTime="2026-03-20 17:50:01.64783071 +0000 UTC m=+1076.513656388" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.678984 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" podStartSLOduration=3.775253294 podStartE2EDuration="16.678968678s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.390068769 +0000 UTC m=+1062.255894447" lastFinishedPulling="2026-03-20 17:50:00.293784153 +0000 UTC m=+1075.159609831" observedRunningTime="2026-03-20 17:50:01.678288248 +0000 UTC m=+1076.544113926" watchObservedRunningTime="2026-03-20 17:50:01.678968678 +0000 UTC m=+1076.544794356" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.698974 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:50:01 crc kubenswrapper[4690]: E0320 17:50:01.699877 4690 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:50:01 crc kubenswrapper[4690]: E0320 17:50:01.699918 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert podName:535eb2e4-3de8-49bd-97a8-135823a8d1c9 nodeName:}" failed. No retries permitted until 2026-03-20 17:50:17.699906141 +0000 UTC m=+1092.565731809 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert") pod "infra-operator-controller-manager-c55d6cc99-gzcjf" (UID: "535eb2e4-3de8-49bd-97a8-135823a8d1c9") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.725219 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" podStartSLOduration=3.62554184 podStartE2EDuration="16.72520034s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.200359382 +0000 UTC m=+1062.066185060" lastFinishedPulling="2026-03-20 17:50:00.300017882 +0000 UTC m=+1075.165843560" observedRunningTime="2026-03-20 17:50:01.696980627 +0000 UTC m=+1076.562806295" watchObservedRunningTime="2026-03-20 17:50:01.72520034 +0000 UTC m=+1076.591026018" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.744081 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" podStartSLOduration=2.99514363 podStartE2EDuration="15.744056073s" podCreationTimestamp="2026-03-20 17:49:46 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.571119586 +0000 UTC m=+1062.436945264" lastFinishedPulling="2026-03-20 17:50:00.320032029 +0000 UTC m=+1075.185857707" observedRunningTime="2026-03-20 17:50:01.739107281 +0000 UTC m=+1076.604932959" watchObservedRunningTime="2026-03-20 17:50:01.744056073 +0000 UTC m=+1076.609881751" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.745738 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" podStartSLOduration=3.648862961 podStartE2EDuration="16.745727961s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.200699002 +0000 UTC m=+1062.066524720" lastFinishedPulling="2026-03-20 17:50:00.297564022 +0000 UTC m=+1075.163389720" observedRunningTime="2026-03-20 17:50:01.722901644 +0000 UTC m=+1076.588727322" watchObservedRunningTime="2026-03-20 17:50:01.745727961 +0000 UTC m=+1076.611553639" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.789475 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" podStartSLOduration=3.630516193 podStartE2EDuration="16.789455461s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.138941323 +0000 UTC m=+1062.004767001" lastFinishedPulling="2026-03-20 17:50:00.297880591 +0000 UTC m=+1075.163706269" observedRunningTime="2026-03-20 17:50:01.778226538 +0000 UTC m=+1076.644052216" watchObservedRunningTime="2026-03-20 17:50:01.789455461 +0000 UTC m=+1076.655281139" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.809726 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" podStartSLOduration=3.8771500100000003 podStartE2EDuration="16.809709575s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.368869358 +0000 UTC m=+1062.234695036" lastFinishedPulling="2026-03-20 17:50:00.301428923 +0000 UTC m=+1075.167254601" observedRunningTime="2026-03-20 17:50:01.804780493 +0000 UTC m=+1076.670606171" watchObservedRunningTime="2026-03-20 
17:50:01.809709575 +0000 UTC m=+1076.675535253" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.855952 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" podStartSLOduration=3.786152338 podStartE2EDuration="16.855923527s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.230053868 +0000 UTC m=+1062.095879546" lastFinishedPulling="2026-03-20 17:50:00.299825047 +0000 UTC m=+1075.165650735" observedRunningTime="2026-03-20 17:50:01.852349784 +0000 UTC m=+1076.718175462" watchObservedRunningTime="2026-03-20 17:50:01.855923527 +0000 UTC m=+1076.721749195" Mar 20 17:50:01 crc kubenswrapper[4690]: I0320 17:50:01.917498 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" podStartSLOduration=3.598624583 podStartE2EDuration="16.917474s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:46.978712205 +0000 UTC m=+1061.844537883" lastFinishedPulling="2026-03-20 17:50:00.297561622 +0000 UTC m=+1075.163387300" observedRunningTime="2026-03-20 17:50:01.901962033 +0000 UTC m=+1076.767787721" watchObservedRunningTime="2026-03-20 17:50:01.917474 +0000 UTC m=+1076.783299678" Mar 20 17:50:02 crc kubenswrapper[4690]: I0320 17:50:02.104759 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:50:02 crc kubenswrapper[4690]: E0320 17:50:02.104916 4690 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:50:02 crc kubenswrapper[4690]: E0320 17:50:02.104966 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert podName:f3bca6f7-be2b-4420-8664-b94ba53d5f7f nodeName:}" failed. No retries permitted until 2026-03-20 17:50:18.104949441 +0000 UTC m=+1092.970775119 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" (UID: "f3bca6f7-be2b-4420-8664-b94ba53d5f7f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:50:02 crc kubenswrapper[4690]: I0320 17:50:02.509477 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:02 crc kubenswrapper[4690]: I0320 17:50:02.509827 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:02 crc kubenswrapper[4690]: E0320 17:50:02.509676 4690 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:50:02 crc kubenswrapper[4690]: E0320 17:50:02.509952 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:50:18.509924941 +0000 UTC m=+1093.375750619 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "metrics-server-cert" not found Mar 20 17:50:02 crc kubenswrapper[4690]: E0320 17:50:02.509974 4690 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:50:02 crc kubenswrapper[4690]: E0320 17:50:02.510022 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs podName:26b1c9fc-55f9-4895-9d23-a7c7e0e811c3 nodeName:}" failed. No retries permitted until 2026-03-20 17:50:18.510007543 +0000 UTC m=+1093.375833221 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs") pod "openstack-operator-controller-manager-54c9b8654f-ms4r7" (UID: "26b1c9fc-55f9-4895-9d23-a7c7e0e811c3") : secret "webhook-server-cert" not found Mar 20 17:50:02 crc kubenswrapper[4690]: I0320 17:50:02.587787 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" event={"ID":"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729","Type":"ContainerStarted","Data":"8ba358eb6c3b3ac3d431f63cad1f33f11e7d1366cd0c13b9100253d62a11fabc"} Mar 20 17:50:02 crc kubenswrapper[4690]: I0320 17:50:02.606388 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" podStartSLOduration=1.361547051 podStartE2EDuration="2.6063726s" podCreationTimestamp="2026-03-20 17:50:00 +0000 UTC" firstStartedPulling="2026-03-20 17:50:00.995065021 +0000 UTC m=+1075.860890699" lastFinishedPulling="2026-03-20 17:50:02.23989057 +0000 UTC m=+1077.105716248" observedRunningTime="2026-03-20 17:50:02.599727689 +0000 UTC m=+1077.465553357" watchObservedRunningTime="2026-03-20 17:50:02.6063726 +0000 UTC m=+1077.472198278" Mar 20 17:50:03 crc kubenswrapper[4690]: I0320 17:50:03.603850 4690 generic.go:334] "Generic (PLEG): container finished" podID="cb74f825-5d0a-4a8a-8d15-d95cfdcf2729" containerID="8ba358eb6c3b3ac3d431f63cad1f33f11e7d1366cd0c13b9100253d62a11fabc" exitCode=0 Mar 20 17:50:03 crc kubenswrapper[4690]: I0320 17:50:03.605091 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" event={"ID":"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729","Type":"ContainerDied","Data":"8ba358eb6c3b3ac3d431f63cad1f33f11e7d1366cd0c13b9100253d62a11fabc"} Mar 20 17:50:04 crc kubenswrapper[4690]: I0320 17:50:04.613168 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" event={"ID":"64a2959f-0b79-4b19-934b-486aad0e782b","Type":"ContainerStarted","Data":"10d8dbb76d14c478af9f0e34c478209629c77bc20cb40fa385a9523f7c4df9c6"} Mar 20 17:50:04 crc kubenswrapper[4690]: I0320 17:50:04.614308 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" Mar 20 17:50:04 crc kubenswrapper[4690]: I0320 17:50:04.628911 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" podStartSLOduration=2.7348102279999997 podStartE2EDuration="18.62888971s" podCreationTimestamp="2026-03-20 17:49:46 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.676311586 +0000 UTC m=+1062.542137264" lastFinishedPulling="2026-03-20 17:50:03.570391028 +0000 UTC m=+1078.436216746" observedRunningTime="2026-03-20 17:50:04.625722488 +0000 UTC m=+1079.491548186" watchObservedRunningTime="2026-03-20 17:50:04.62888971 +0000 UTC m=+1079.494715388" Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.021308 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.196704 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtgp\" (UniqueName: \"kubernetes.io/projected/cb74f825-5d0a-4a8a-8d15-d95cfdcf2729-kube-api-access-zrtgp\") pod \"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729\" (UID: \"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729\") " Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.215137 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb74f825-5d0a-4a8a-8d15-d95cfdcf2729-kube-api-access-zrtgp" (OuterVolumeSpecName: "kube-api-access-zrtgp") pod "cb74f825-5d0a-4a8a-8d15-d95cfdcf2729" (UID: "cb74f825-5d0a-4a8a-8d15-d95cfdcf2729"). InnerVolumeSpecName "kube-api-access-zrtgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.298706 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrtgp\" (UniqueName: \"kubernetes.io/projected/cb74f825-5d0a-4a8a-8d15-d95cfdcf2729-kube-api-access-zrtgp\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.625402 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.625681 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-4m2nn" event={"ID":"cb74f825-5d0a-4a8a-8d15-d95cfdcf2729","Type":"ContainerDied","Data":"8a894c57066acf32b0b61c9c3cc7f5c07a316b635b935926062d22f80b793997"} Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.625714 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a894c57066acf32b0b61c9c3cc7f5c07a316b635b935926062d22f80b793997" Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.688005 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-gsxt8"] Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.690711 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-gsxt8"] Mar 20 17:50:05 crc kubenswrapper[4690]: I0320 17:50:05.911387 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbf81b2-25de-4f9f-b7df-e61e997e9418" path="/var/lib/kubelet/pods/7bbf81b2-25de-4f9f-b7df-e61e997e9418/volumes" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.011478 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5bf49" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.065051 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mbp48" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.115236 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-w2m9s" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.150298 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-27xp7" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.162900 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wclkd" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.188073 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rtb26" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.266635 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-pxcl7" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.321122 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xs6lt" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.416748 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-xzzx7" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.452993 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mmwbh" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.519942 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-fm55z" Mar 20 17:50:06 crc kubenswrapper[4690]: I0320 17:50:06.843174 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-48tn7" Mar 20 17:50:08 crc kubenswrapper[4690]: I0320 17:50:08.643598 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" event={"ID":"dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4","Type":"ContainerStarted","Data":"58b419fcbfc4cbbbb525461f6c87b583533c39e577ae0fbceb23f7ab38fc525e"} Mar 20 17:50:08 crc kubenswrapper[4690]: I0320 17:50:08.644781 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" event={"ID":"d7d5bc08-99d0-4361-ae0e-ca9732db6154","Type":"ContainerStarted","Data":"d47bf0909eca18e182a567571f532afd3115fa1c516a0da23c8650033c700d39"} Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.215007 4690 scope.go:117] "RemoveContainer" containerID="607bc9a4bf1c0ac71f852eab2f02e5048ac0c738c9bdea5c44150edcba212202" Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.654859 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" event={"ID":"c7a8e424-00f3-4e97-b6b3-bd2513624b2e","Type":"ContainerStarted","Data":"05b99c2f912c282b352909e329867917fbff1de847dcc010a87d4da764a7b96b"} Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.656015 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.657395 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" event={"ID":"64ced890-6363-43c3-83e5-0001c72851ef","Type":"ContainerStarted","Data":"0cc88a3756d8e8cc6c10679ac563e6826a14cab8e551b46ca0783feb37548841"} Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.657676 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.657912 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.658141 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.676050 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" podStartSLOduration=3.861679438 podStartE2EDuration="23.676032714s" podCreationTimestamp="2026-03-20 17:49:46 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.689705462 +0000 UTC m=+1062.555531140" lastFinishedPulling="2026-03-20 17:50:07.504058738 +0000 UTC m=+1082.369884416" observedRunningTime="2026-03-20 17:50:09.674407107 +0000 UTC m=+1084.540232785" watchObservedRunningTime="2026-03-20 17:50:09.676032714 +0000 UTC m=+1084.541858412" Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.695953 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" podStartSLOduration=4.779216352 podStartE2EDuration="24.695931987s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.585764248 +0000 UTC m=+1062.451589916" lastFinishedPulling="2026-03-20 17:50:07.502479873 +0000 UTC m=+1082.368305551" observedRunningTime="2026-03-20 17:50:09.690155121 +0000 UTC m=+1084.555980819" watchObservedRunningTime="2026-03-20 17:50:09.695931987 +0000 UTC m=+1084.561757685" Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.712921 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" podStartSLOduration=3.851332889 podStartE2EDuration="23.712892966s" podCreationTimestamp="2026-03-20 17:49:46 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.669279243 +0000 UTC m=+1062.535104931" lastFinishedPulling="2026-03-20 17:50:07.53083933 +0000 UTC m=+1082.396665008" observedRunningTime="2026-03-20 17:50:09.704131054 +0000 UTC m=+1084.569956752" watchObservedRunningTime="2026-03-20 17:50:09.712892966 +0000 UTC m=+1084.578718664" Mar 20 17:50:09 crc kubenswrapper[4690]: I0320 17:50:09.726050 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" podStartSLOduration=3.906314334 podStartE2EDuration="23.726033485s" podCreationTimestamp="2026-03-20 17:49:46 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.687762396 +0000 UTC m=+1062.553588084" lastFinishedPulling="2026-03-20 17:50:07.507481557 +0000 UTC m=+1082.373307235" observedRunningTime="2026-03-20 17:50:09.720741052 +0000 UTC m=+1084.586566730" watchObservedRunningTime="2026-03-20 17:50:09.726033485 +0000 UTC m=+1084.591859163" Mar 20 17:50:14 crc kubenswrapper[4690]: I0320 17:50:14.693607 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" event={"ID":"458fe699-42d5-44ad-9288-3b6fbcd87161","Type":"ContainerStarted","Data":"09938c4d98caa987afcb5f6fff45414ba5e6d109ab1bc8b0458e3ca1f4999031"} Mar 20 17:50:14 crc kubenswrapper[4690]: I0320 17:50:14.694306 4690 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" Mar 20 17:50:14 crc kubenswrapper[4690]: I0320 17:50:14.696014 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" event={"ID":"a8f81ddb-a5b3-4881-88de-66ed78d8d344","Type":"ContainerStarted","Data":"dfed18d2cb18966a38d66d75e1a8bc241ec0892309ed9397dcbd1badf2495d31"} Mar 20 17:50:14 crc kubenswrapper[4690]: I0320 17:50:14.696193 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" Mar 20 17:50:14 crc kubenswrapper[4690]: I0320 17:50:14.713880 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" podStartSLOduration=2.950698154 podStartE2EDuration="29.713860181s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.58168025 +0000 UTC m=+1062.447505928" lastFinishedPulling="2026-03-20 17:50:14.344842277 +0000 UTC m=+1089.210667955" observedRunningTime="2026-03-20 17:50:14.707893429 +0000 UTC m=+1089.573719127" watchObservedRunningTime="2026-03-20 17:50:14.713860181 +0000 UTC m=+1089.579685859" Mar 20 17:50:14 crc kubenswrapper[4690]: I0320 17:50:14.725806 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" podStartSLOduration=3.007943644 podStartE2EDuration="29.725787775s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:47.570293162 +0000 UTC m=+1062.436118840" lastFinishedPulling="2026-03-20 17:50:14.288137303 +0000 UTC m=+1089.153962971" observedRunningTime="2026-03-20 17:50:14.720823221 +0000 UTC m=+1089.586648909" watchObservedRunningTime="2026-03-20 17:50:14.725787775 +0000 UTC m=+1089.591613453" Mar 20 17:50:16 crc kubenswrapper[4690]: I0320 17:50:16.246401 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-xgkr9" Mar 20 17:50:16 crc kubenswrapper[4690]: I0320 17:50:16.872129 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-klmr6" Mar 20 17:50:16 crc kubenswrapper[4690]: I0320 17:50:16.915104 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8kl25" Mar 20 17:50:16 crc kubenswrapper[4690]: I0320 17:50:16.934591 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vnl4s" Mar 20 17:50:16 crc kubenswrapper[4690]: I0320 17:50:16.949800 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pv7x5" Mar 20 17:50:17 crc kubenswrapper[4690]: I0320 17:50:17.799057 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:50:17 crc kubenswrapper[4690]: I0320 
17:50:17.808641 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/535eb2e4-3de8-49bd-97a8-135823a8d1c9-cert\") pod \"infra-operator-controller-manager-c55d6cc99-gzcjf\" (UID: \"535eb2e4-3de8-49bd-97a8-135823a8d1c9\") " pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.061822 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-q6spz" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.069940 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.105011 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.112153 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3bca6f7-be2b-4420-8664-b94ba53d5f7f-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5m7n5b\" (UID: \"f3bca6f7-be2b-4420-8664-b94ba53d5f7f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.277437 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5z8z7" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.286706 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.390422 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf"] Mar 20 17:50:18 crc kubenswrapper[4690]: W0320 17:50:18.404680 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535eb2e4_3de8_49bd_97a8_135823a8d1c9.slice/crio-9a04a2c9b9b92df6ec5f5a1d7c02766f892c81900e60d60b8df1ff6e651d2542 WatchSource:0}: Error finding container 9a04a2c9b9b92df6ec5f5a1d7c02766f892c81900e60d60b8df1ff6e651d2542: Status 404 returned error can't find the container with id 9a04a2c9b9b92df6ec5f5a1d7c02766f892c81900e60d60b8df1ff6e651d2542 Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.565332 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b"] Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.610541 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.610659 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.616696 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-webhook-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.618354 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26b1c9fc-55f9-4895-9d23-a7c7e0e811c3-metrics-certs\") pod \"openstack-operator-controller-manager-54c9b8654f-ms4r7\" (UID: \"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3\") " pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.731340 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" event={"ID":"535eb2e4-3de8-49bd-97a8-135823a8d1c9","Type":"ContainerStarted","Data":"9a04a2c9b9b92df6ec5f5a1d7c02766f892c81900e60d60b8df1ff6e651d2542"} Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.733032 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" event={"ID":"f3bca6f7-be2b-4420-8664-b94ba53d5f7f","Type":"ContainerStarted","Data":"640fe54acf5b947cf8d4204ee1716f2c74f6ac24e17f810eda5de87fe2e58629"} Mar 20 17:50:18 crc 
kubenswrapper[4690]: I0320 17:50:18.779351 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2jwf2" Mar 20 17:50:18 crc kubenswrapper[4690]: I0320 17:50:18.786834 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:19 crc kubenswrapper[4690]: I0320 17:50:19.265216 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7"] Mar 20 17:50:19 crc kubenswrapper[4690]: W0320 17:50:19.277007 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26b1c9fc_55f9_4895_9d23_a7c7e0e811c3.slice/crio-77455a56f1ff2a1f5455beddf14260d46ec653424e8fb4814cb77e67cb4774a1 WatchSource:0}: Error finding container 77455a56f1ff2a1f5455beddf14260d46ec653424e8fb4814cb77e67cb4774a1: Status 404 returned error can't find the container with id 77455a56f1ff2a1f5455beddf14260d46ec653424e8fb4814cb77e67cb4774a1 Mar 20 17:50:19 crc kubenswrapper[4690]: I0320 17:50:19.745276 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" event={"ID":"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3","Type":"ContainerStarted","Data":"77455a56f1ff2a1f5455beddf14260d46ec653424e8fb4814cb77e67cb4774a1"} Mar 20 17:50:24 crc kubenswrapper[4690]: I0320 17:50:24.788885 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" event={"ID":"26b1c9fc-55f9-4895-9d23-a7c7e0e811c3","Type":"ContainerStarted","Data":"f2608e75b125fbd3bf00950bd64a2309bd4011930d93a5389f8896bc7846e2e9"} Mar 20 17:50:25 crc kubenswrapper[4690]: I0320 17:50:25.797180 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:25 crc kubenswrapper[4690]: I0320 17:50:25.917159 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" podStartSLOduration=39.917139927 podStartE2EDuration="39.917139927s" podCreationTimestamp="2026-03-20 17:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:50:25.834561917 +0000 UTC m=+1100.700387615" watchObservedRunningTime="2026-03-20 17:50:25.917139927 +0000 UTC m=+1100.782965615" Mar 20 17:50:26 crc kubenswrapper[4690]: I0320 17:50:26.387073 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-55787" Mar 20 17:50:26 crc kubenswrapper[4690]: I0320 17:50:26.483682 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-tllng" Mar 20 17:50:27 crc kubenswrapper[4690]: I0320 17:50:27.810596 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" event={"ID":"f3bca6f7-be2b-4420-8664-b94ba53d5f7f","Type":"ContainerStarted","Data":"ace7cd8645e009ab5b599b2c1302b0a025e07c06c0bc14d1a29cd316e5c73bf0"} Mar 20 17:50:27 crc kubenswrapper[4690]: I0320 17:50:27.810982 4690 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:50:27 crc kubenswrapper[4690]: I0320 17:50:27.813949 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" event={"ID":"535eb2e4-3de8-49bd-97a8-135823a8d1c9","Type":"ContainerStarted","Data":"61586e8870e2376ab58276b021d1461d49bd8fc1bee1cc977e6e0a5a1e1e9104"} Mar 20 17:50:27 crc kubenswrapper[4690]: I0320 17:50:27.814126 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:50:27 crc kubenswrapper[4690]: I0320 17:50:27.848403 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" podStartSLOduration=33.53727781 podStartE2EDuration="41.848383136s" podCreationTimestamp="2026-03-20 17:49:46 +0000 UTC" firstStartedPulling="2026-03-20 17:50:18.559854585 +0000 UTC m=+1093.425680253" lastFinishedPulling="2026-03-20 17:50:26.870959901 +0000 UTC m=+1101.736785579" observedRunningTime="2026-03-20 17:50:27.844532225 +0000 UTC m=+1102.710357923" watchObservedRunningTime="2026-03-20 17:50:27.848383136 +0000 UTC m=+1102.714208824" Mar 20 17:50:27 crc kubenswrapper[4690]: I0320 17:50:27.869430 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" podStartSLOduration=34.359919467 podStartE2EDuration="42.869411261s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:50:18.407155324 +0000 UTC m=+1093.272981002" lastFinishedPulling="2026-03-20 17:50:26.916647108 +0000 UTC m=+1101.782472796" observedRunningTime="2026-03-20 17:50:27.863504531 +0000 UTC m=+1102.729330219" watchObservedRunningTime="2026-03-20 17:50:27.869411261 +0000 UTC m=+1102.735236949" Mar 20 17:50:38 crc kubenswrapper[4690]: I0320 17:50:38.081176 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-c55d6cc99-gzcjf" Mar 20 17:50:38 crc kubenswrapper[4690]: I0320 17:50:38.295717 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5m7n5b" Mar 20 17:50:38 crc kubenswrapper[4690]: I0320 17:50:38.794728 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54c9b8654f-ms4r7" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.453753 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n27gl"] Mar 20 17:50:54 crc kubenswrapper[4690]: E0320 17:50:54.458721 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb74f825-5d0a-4a8a-8d15-d95cfdcf2729" containerName="oc" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.458750 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb74f825-5d0a-4a8a-8d15-d95cfdcf2729" containerName="oc" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.458898 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb74f825-5d0a-4a8a-8d15-d95cfdcf2729" containerName="oc" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.459639 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.463969 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.465804 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.465806 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-w7xlp" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.466018 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.485116 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n27gl"] Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.494113 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wmwmz"] Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.495484 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.497044 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.505164 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wmwmz"] Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.602922 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t557r\" (UniqueName: \"kubernetes.io/projected/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-kube-api-access-t557r\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.602961 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6c828-810d-4ce8-b7d5-4242668f01db-config\") pod \"dnsmasq-dns-675f4bcbfc-n27gl\" (UID: \"c7a6c828-810d-4ce8-b7d5-4242668f01db\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.602981 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.603017 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn5km\" (UniqueName: \"kubernetes.io/projected/c7a6c828-810d-4ce8-b7d5-4242668f01db-kube-api-access-pn5km\") pod \"dnsmasq-dns-675f4bcbfc-n27gl\" (UID: \"c7a6c828-810d-4ce8-b7d5-4242668f01db\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.603086 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-config\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.704788 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6c828-810d-4ce8-b7d5-4242668f01db-config\") pod \"dnsmasq-dns-675f4bcbfc-n27gl\" (UID: \"c7a6c828-810d-4ce8-b7d5-4242668f01db\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.704901 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t557r\" (UniqueName: \"kubernetes.io/projected/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-kube-api-access-t557r\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.704964 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.705024 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn5km\" (UniqueName: \"kubernetes.io/projected/c7a6c828-810d-4ce8-b7d5-4242668f01db-kube-api-access-pn5km\") pod \"dnsmasq-dns-675f4bcbfc-n27gl\" (UID: \"c7a6c828-810d-4ce8-b7d5-4242668f01db\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.705163 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-config\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.705698 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6c828-810d-4ce8-b7d5-4242668f01db-config\") pod \"dnsmasq-dns-675f4bcbfc-n27gl\" (UID: \"c7a6c828-810d-4ce8-b7d5-4242668f01db\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.706387 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.706744 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-config\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.724019 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn5km\" (UniqueName: \"kubernetes.io/projected/c7a6c828-810d-4ce8-b7d5-4242668f01db-kube-api-access-pn5km\") pod \"dnsmasq-dns-675f4bcbfc-n27gl\" (UID: \"c7a6c828-810d-4ce8-b7d5-4242668f01db\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.724757 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t557r\" (UniqueName: \"kubernetes.io/projected/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-kube-api-access-t557r\") pod \"dnsmasq-dns-78dd6ddcc-wmwmz\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.777022 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:50:54 crc kubenswrapper[4690]: I0320 17:50:54.824715 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:50:55 crc kubenswrapper[4690]: I0320 17:50:55.046566 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n27gl"] Mar 20 17:50:55 crc kubenswrapper[4690]: W0320 17:50:55.326804 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90a6ba98_0064_4f8e_bd8a_dfa13ddc8d2e.slice/crio-1d010598536502c745367fac9cfbc76add0e7f1d32099b409a8f8c2654be43eb WatchSource:0}: Error finding container 1d010598536502c745367fac9cfbc76add0e7f1d32099b409a8f8c2654be43eb: Status 404 returned error can't find the container with id 1d010598536502c745367fac9cfbc76add0e7f1d32099b409a8f8c2654be43eb Mar 20 17:50:55 crc kubenswrapper[4690]: I0320 17:50:55.331678 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wmwmz"] Mar 20 17:50:56 crc kubenswrapper[4690]: I0320 17:50:56.059826 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" event={"ID":"c7a6c828-810d-4ce8-b7d5-4242668f01db","Type":"ContainerStarted","Data":"14d33695b44e8c73d8f6c27efa7113255899d36cd39b02633f805f136c4fc1d8"} Mar 20 17:50:56 crc kubenswrapper[4690]: I0320 17:50:56.061284 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" event={"ID":"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e","Type":"ContainerStarted","Data":"1d010598536502c745367fac9cfbc76add0e7f1d32099b409a8f8c2654be43eb"} Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.183564 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n27gl"] Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.213922 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ztc9k"] Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.215061 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.224493 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ztc9k"] Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.268913 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.268986 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn267\" (UniqueName: \"kubernetes.io/projected/875bf125-544f-4899-ac00-797737833d7e-kube-api-access-dn267\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.269033 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-config\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.370935 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-config\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.371021 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.371059 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn267\" (UniqueName: \"kubernetes.io/projected/875bf125-544f-4899-ac00-797737833d7e-kube-api-access-dn267\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.372115 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-config\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.372946 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.394746 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn267\" (UniqueName: 
\"kubernetes.io/projected/875bf125-544f-4899-ac00-797737833d7e-kube-api-access-dn267\") pod \"dnsmasq-dns-666b6646f7-ztc9k\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.406381 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wmwmz"] Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.430516 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9mzwk"] Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.432745 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.451991 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9mzwk"] Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.471928 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.471980 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-config\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.472128 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djktk\" (UniqueName: \"kubernetes.io/projected/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-kube-api-access-djktk\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.538055 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.573402 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djktk\" (UniqueName: \"kubernetes.io/projected/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-kube-api-access-djktk\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.573516 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.573546 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-config\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.575005 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-config\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.575513 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.587988 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djktk\" (UniqueName: \"kubernetes.io/projected/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-kube-api-access-djktk\") pod \"dnsmasq-dns-57d769cc4f-9mzwk\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.759161 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:50:57 crc kubenswrapper[4690]: I0320 17:50:57.957877 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ztc9k"] Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.080726 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" event={"ID":"875bf125-544f-4899-ac00-797737833d7e","Type":"ContainerStarted","Data":"929044064d11a669a394297c9f0b2236304210d6bed8708c20050cac82b63b77"} Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.171847 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.173223 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.174630 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.174993 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6x46v" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.175174 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.176672 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.177061 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.177232 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.178845 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.182865 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.256076 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9mzwk"] Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.316951 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317020 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317039 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317056 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317091 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317132 4690 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317151 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317186 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317200 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qrh\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-kube-api-access-m6qrh\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317216 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.317242 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.368852 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.369899 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.372184 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.372383 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.372460 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.372827 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.372923 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.373184 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kbltv" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.373208 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.384801 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.418927 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.418967 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qrh\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-kube-api-access-m6qrh\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.418987 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419008 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419034 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419056 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419073 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-869th\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-kube-api-access-869th\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419103 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419123 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419141 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419157 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419173 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419199 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419216 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419235 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419250 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419281 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419293 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419313 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419329 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419346 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.419360 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.420157 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.422016 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.422627 4690 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.423488 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.423609 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.424028 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.428582 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.428597 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.429083 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.435992 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.441215 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qrh\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-kube-api-access-m6qrh\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.454693 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " pod="openstack/rabbitmq-server-0" Mar 20 
17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.517086 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.519982 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520037 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520068 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520097 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520121 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520141 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520162 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520202 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520240 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 
crc kubenswrapper[4690]: I0320 17:50:58.520280 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-869th\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-kube-api-access-869th\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520322 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.520843 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.526023 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.526312 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.526512 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.526577 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.527116 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.531552 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.532225 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.536207 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.541058 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.544341 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-869th\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-kube-api-access-869th\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.558134 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:58 crc kubenswrapper[4690]: I0320 17:50:58.691723 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.739852 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.741981 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.748478 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.748695 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lh48f" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.748785 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.749168 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.750494 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.764647 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.841744 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dacc9bed-eaa9-4747-8a92-30f5afa0a698-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.841830 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.841880 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.841895 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dacc9bed-eaa9-4747-8a92-30f5afa0a698-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.841914 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dacc9bed-eaa9-4747-8a92-30f5afa0a698-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.841934 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-kolla-config\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.841951 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-config-data-default\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.841966 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76xx9\" (UniqueName: \"kubernetes.io/projected/dacc9bed-eaa9-4747-8a92-30f5afa0a698-kube-api-access-76xx9\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944134 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dacc9bed-eaa9-4747-8a92-30f5afa0a698-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944244 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944345 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944368 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dacc9bed-eaa9-4747-8a92-30f5afa0a698-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944390 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dacc9bed-eaa9-4747-8a92-30f5afa0a698-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944400 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944413 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-kolla-config\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944450 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.944476 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76xx9\" (UniqueName: \"kubernetes.io/projected/dacc9bed-eaa9-4747-8a92-30f5afa0a698-kube-api-access-76xx9\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.945948 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-config-data-default\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.945957 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.946067 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dacc9bed-eaa9-4747-8a92-30f5afa0a698-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.946662 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dacc9bed-eaa9-4747-8a92-30f5afa0a698-kolla-config\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.949904 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dacc9bed-eaa9-4747-8a92-30f5afa0a698-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.961887 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dacc9bed-eaa9-4747-8a92-30f5afa0a698-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.964825 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76xx9\" (UniqueName: \"kubernetes.io/projected/dacc9bed-eaa9-4747-8a92-30f5afa0a698-kube-api-access-76xx9\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:50:59 crc kubenswrapper[4690]: I0320 17:50:59.969729 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"dacc9bed-eaa9-4747-8a92-30f5afa0a698\") " pod="openstack/openstack-galera-0" Mar 20 17:51:00 crc kubenswrapper[4690]: I0320 17:51:00.077516 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.115878 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" event={"ID":"03da2fe9-5a86-4fba-87c7-7b2132c31d5f","Type":"ContainerStarted","Data":"f0684160015a6d523d6d312a456c6868dff5d334dd83e652d5a4d44238fbd79a"} Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.154800 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.156363 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.159099 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.159306 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-25r2v" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.159455 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.161470 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.169732 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.267590 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4aa597c-8302-463f-a383-39c9a51baa2c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.267652 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.267687 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4aa597c-8302-463f-a383-39c9a51baa2c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.267713 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.267762 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d4aa597c-8302-463f-a383-39c9a51baa2c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.267794 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.268046 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.268151 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrj82\" (UniqueName: \"kubernetes.io/projected/d4aa597c-8302-463f-a383-39c9a51baa2c-kube-api-access-mrj82\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.286163 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.287024 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.288781 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.289241 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.289717 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-jtncr" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.302001 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370375 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrzf2\" (UniqueName: \"kubernetes.io/projected/b74da73d-632e-490b-b3c7-22450d29ede6-kube-api-access-xrzf2\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370434 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b74da73d-632e-490b-b3c7-22450d29ede6-kolla-config\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370469 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4aa597c-8302-463f-a383-39c9a51baa2c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370500 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370540 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4aa597c-8302-463f-a383-39c9a51baa2c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370574 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370611 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d4aa597c-8302-463f-a383-39c9a51baa2c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370634 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b74da73d-632e-490b-b3c7-22450d29ede6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370686 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b74da73d-632e-490b-b3c7-22450d29ede6-config-data\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370723 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370792 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74da73d-632e-490b-b3c7-22450d29ede6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370821 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.370883 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrj82\" (UniqueName: \"kubernetes.io/projected/d4aa597c-8302-463f-a383-39c9a51baa2c-kube-api-access-mrj82\") pod \"openstack-cell1-galera-0\" 
(UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.372961 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.373279 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d4aa597c-8302-463f-a383-39c9a51baa2c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.373460 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.374801 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.375780 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4aa597c-8302-463f-a383-39c9a51baa2c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.376972 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4aa597c-8302-463f-a383-39c9a51baa2c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.377894 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4aa597c-8302-463f-a383-39c9a51baa2c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.390480 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrj82\" (UniqueName: \"kubernetes.io/projected/d4aa597c-8302-463f-a383-39c9a51baa2c-kube-api-access-mrj82\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.400400 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d4aa597c-8302-463f-a383-39c9a51baa2c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.472604 
4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrzf2\" (UniqueName: \"kubernetes.io/projected/b74da73d-632e-490b-b3c7-22450d29ede6-kube-api-access-xrzf2\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.472656 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b74da73d-632e-490b-b3c7-22450d29ede6-kolla-config\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.472737 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b74da73d-632e-490b-b3c7-22450d29ede6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.472772 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b74da73d-632e-490b-b3c7-22450d29ede6-config-data\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.472813 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74da73d-632e-490b-b3c7-22450d29ede6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.473586 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b74da73d-632e-490b-b3c7-22450d29ede6-kolla-config\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.474031 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b74da73d-632e-490b-b3c7-22450d29ede6-config-data\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.477033 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b74da73d-632e-490b-b3c7-22450d29ede6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.482036 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74da73d-632e-490b-b3c7-22450d29ede6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.482636 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.491888 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrzf2\" (UniqueName: \"kubernetes.io/projected/b74da73d-632e-490b-b3c7-22450d29ede6-kube-api-access-xrzf2\") pod \"memcached-0\" (UID: \"b74da73d-632e-490b-b3c7-22450d29ede6\") " pod="openstack/memcached-0" Mar 20 17:51:01 crc kubenswrapper[4690]: I0320 17:51:01.646481 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 17:51:03 crc kubenswrapper[4690]: I0320 17:51:03.824229 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:51:03 crc kubenswrapper[4690]: I0320 17:51:03.825108 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:51:03 crc kubenswrapper[4690]: I0320 17:51:03.827448 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gxsc8" Mar 20 17:51:03 crc kubenswrapper[4690]: I0320 17:51:03.839093 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:51:03 crc kubenswrapper[4690]: I0320 17:51:03.911127 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmsd\" (UniqueName: \"kubernetes.io/projected/5855b86f-8504-4956-af4e-0cb3f9ace108-kube-api-access-vmmsd\") pod \"kube-state-metrics-0\" (UID: \"5855b86f-8504-4956-af4e-0cb3f9ace108\") " pod="openstack/kube-state-metrics-0" Mar 20 17:51:04 crc kubenswrapper[4690]: I0320 17:51:04.012885 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmsd\" (UniqueName: \"kubernetes.io/projected/5855b86f-8504-4956-af4e-0cb3f9ace108-kube-api-access-vmmsd\") pod \"kube-state-metrics-0\" (UID: \"5855b86f-8504-4956-af4e-0cb3f9ace108\") " pod="openstack/kube-state-metrics-0" Mar 20 17:51:04 crc kubenswrapper[4690]: I0320 17:51:04.029388 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmsd\" (UniqueName: \"kubernetes.io/projected/5855b86f-8504-4956-af4e-0cb3f9ace108-kube-api-access-vmmsd\") pod \"kube-state-metrics-0\" (UID: \"5855b86f-8504-4956-af4e-0cb3f9ace108\") " pod="openstack/kube-state-metrics-0" Mar 20 17:51:04 crc kubenswrapper[4690]: I0320 17:51:04.142730 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:51:04 crc kubenswrapper[4690]: I0320 17:51:04.871590 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.121129 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j8pr4"] Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.122923 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.126896 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6vh8f" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.129079 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.130375 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.135152 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.136501 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.146385 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.149390 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.149647 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bn5nn" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.149838 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.151025 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.155799 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8dmtk"] Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.157645 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.163092 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j8pr4"] Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.169288 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.173239 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8dmtk"] Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.188778 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/172668db-85fb-47e1-82fe-dee7c454993e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189007 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189112 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-run-ovn\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189202 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-combined-ca-bundle\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189397 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7x2f\" (UniqueName: \"kubernetes.io/projected/172668db-85fb-47e1-82fe-dee7c454993e-kube-api-access-v7x2f\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189476 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/172668db-85fb-47e1-82fe-dee7c454993e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189578 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189677 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-ovsdbserver-nb-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189752 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-scripts\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189822 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vc9g\" (UniqueName: \"kubernetes.io/projected/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-kube-api-access-4vc9g\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189899 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/172668db-85fb-47e1-82fe-dee7c454993e-config\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.189962 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-ovn-controller-tls-certs\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.190036 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-run\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.190102 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.190162 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-log-ovn\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293221 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-log\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293321 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/497fed5f-b87a-4042-ae22-186983ed7536-scripts\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " 
pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293360 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-lib\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293390 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-run-ovn\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293421 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-etc-ovs\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293468 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-combined-ca-bundle\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293513 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7x2f\" (UniqueName: \"kubernetes.io/projected/172668db-85fb-47e1-82fe-dee7c454993e-kube-api-access-v7x2f\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293535 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/172668db-85fb-47e1-82fe-dee7c454993e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293554 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7mpn\" (UniqueName: \"kubernetes.io/projected/497fed5f-b87a-4042-ae22-186983ed7536-kube-api-access-j7mpn\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293600 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293638 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293659 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-scripts\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293683 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vc9g\" (UniqueName: \"kubernetes.io/projected/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-kube-api-access-4vc9g\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293730 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/172668db-85fb-47e1-82fe-dee7c454993e-config\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293764 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-run\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293785 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-ovn-controller-tls-certs\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293816 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293836 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-log-ovn\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293862 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/172668db-85fb-47e1-82fe-dee7c454993e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293888 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293909 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-run\") pod \"ovn-controller-ovs-8dmtk\" (UID: 
\"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.293937 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-run-ovn\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.294776 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.294876 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/172668db-85fb-47e1-82fe-dee7c454993e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.295875 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-scripts\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.295935 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/172668db-85fb-47e1-82fe-dee7c454993e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.296295 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-run\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.294788 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-var-log-ovn\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.299035 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/172668db-85fb-47e1-82fe-dee7c454993e-config\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.311328 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.311836 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-combined-ca-bundle\") pod 
\"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.312211 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-ovn-controller-tls-certs\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.317002 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vc9g\" (UniqueName: \"kubernetes.io/projected/f0e78344-d5a9-4bc2-9556-e3daf0ce19db-kube-api-access-4vc9g\") pod \"ovn-controller-j8pr4\" (UID: \"f0e78344-d5a9-4bc2-9556-e3daf0ce19db\") " pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.322170 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.324362 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7x2f\" (UniqueName: \"kubernetes.io/projected/172668db-85fb-47e1-82fe-dee7c454993e-kube-api-access-v7x2f\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.329899 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/172668db-85fb-47e1-82fe-dee7c454993e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.336094 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"172668db-85fb-47e1-82fe-dee7c454993e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.395926 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7mpn\" (UniqueName: \"kubernetes.io/projected/497fed5f-b87a-4042-ae22-186983ed7536-kube-api-access-j7mpn\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396047 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-run\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396090 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-log\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396114 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/497fed5f-b87a-4042-ae22-186983ed7536-scripts\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396150 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-lib\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396177 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-etc-ovs\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396433 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-etc-ovs\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396525 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-log\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396587 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-run\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.396585 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/497fed5f-b87a-4042-ae22-186983ed7536-var-lib\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.398240 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/497fed5f-b87a-4042-ae22-186983ed7536-scripts\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.411835 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7mpn\" (UniqueName: \"kubernetes.io/projected/497fed5f-b87a-4042-ae22-186983ed7536-kube-api-access-j7mpn\") pod \"ovn-controller-ovs-8dmtk\" (UID: \"497fed5f-b87a-4042-ae22-186983ed7536\") " pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.451053 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.457948 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:07 crc kubenswrapper[4690]: I0320 17:51:07.475205 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.181088 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7","Type":"ContainerStarted","Data":"90c0dff250aab7f2bbe343a386791e62adb9fdbf2ae00b7c9e03c065674a4553"} Mar 20 17:51:10 crc kubenswrapper[4690]: E0320 17:51:10.411406 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 17:51:10 crc kubenswrapper[4690]: E0320 17:51:10.411769 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pn5km,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-n27gl_openstack(c7a6c828-810d-4ce8-b7d5-4242668f01db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:51:10 crc kubenswrapper[4690]: E0320 17:51:10.413124 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" podUID="c7a6c828-810d-4ce8-b7d5-4242668f01db" Mar 20 17:51:10 crc kubenswrapper[4690]: E0320 17:51:10.633079 4690 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 17:51:10 crc kubenswrapper[4690]: E0320 17:51:10.633518 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t557r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-wmwmz_openstack(90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:51:10 crc kubenswrapper[4690]: E0320 17:51:10.634640 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" podUID="90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.678888 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.680182 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.682959 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.683207 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.683830 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.690008 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lfqt4" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.706128 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.753133 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-config\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.753221 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km42d\" (UniqueName: \"kubernetes.io/projected/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-kube-api-access-km42d\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.753309 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.753349 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.753380 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.753423 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.753452 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.753515 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.855504 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.855877 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-config\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.855911 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km42d\" (UniqueName: \"kubernetes.io/projected/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-kube-api-access-km42d\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.855973 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.856014 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.856046 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.856083 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.856131 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.857089 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.857188 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.857491 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.857593 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-config\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.865436 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.870915 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.871351 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.877561 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km42d\" (UniqueName: \"kubernetes.io/projected/d64e7a22-5bb9-49ec-95d6-7ff145a31f9a-kube-api-access-km42d\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.925568 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.955447 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:51:10 crc kubenswrapper[4690]: I0320 17:51:10.961146 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.008404 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:11 crc kubenswrapper[4690]: E0320 17:51:11.035584 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod875bf125_544f_4899_ac00_797737833d7e.slice/crio-b7f1836052e14774be03519bb4120838e3bf6b3b951beb1928c29424d0b96c7c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod875bf125_544f_4899_ac00_797737833d7e.slice/crio-conmon-b7f1836052e14774be03519bb4120838e3bf6b3b951beb1928c29424d0b96c7c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03da2fe9_5a86_4fba_87c7_7b2132c31d5f.slice/crio-conmon-6b008c90252cedb91c5007188f4d976d4aac86e642c8f4a5ee05eadc58541fb4.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.104034 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.179807 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.186668 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.195622 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc","Type":"ContainerStarted","Data":"67c11610348986459be7d3545e959b3f6c3cb99823efa90eeaf0b4cf35de901c"} Mar 20 17:51:11 crc kubenswrapper[4690]: W0320 17:51:11.196181 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5855b86f_8504_4956_af4e_0cb3f9ace108.slice/crio-87d6401aa5bd45c5c7d4ae253c0af29f9b06720a31cb4e92b14b338f3af8ba94 WatchSource:0}: Error finding container 87d6401aa5bd45c5c7d4ae253c0af29f9b06720a31cb4e92b14b338f3af8ba94: Status 404 returned error can't find the container with id 87d6401aa5bd45c5c7d4ae253c0af29f9b06720a31cb4e92b14b338f3af8ba94 Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.198329 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d4aa597c-8302-463f-a383-39c9a51baa2c","Type":"ContainerStarted","Data":"f6541124b560a61127a1740d22dcaa2f53e1863d08b3d94e464f23e3cd75e1e4"} Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.200061 4690 generic.go:334] "Generic (PLEG): container finished" podID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" containerID="6b008c90252cedb91c5007188f4d976d4aac86e642c8f4a5ee05eadc58541fb4" exitCode=0 Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.200133 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" event={"ID":"03da2fe9-5a86-4fba-87c7-7b2132c31d5f","Type":"ContainerDied","Data":"6b008c90252cedb91c5007188f4d976d4aac86e642c8f4a5ee05eadc58541fb4"} Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.201510 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dacc9bed-eaa9-4747-8a92-30f5afa0a698","Type":"ContainerStarted","Data":"6d922755b4e4baf9733e2e6e0c80311c78dc994db03c9e448158dd3cf37e309c"} Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.203008 4690 generic.go:334] "Generic 
(PLEG): container finished" podID="875bf125-544f-4899-ac00-797737833d7e" containerID="b7f1836052e14774be03519bb4120838e3bf6b3b951beb1928c29424d0b96c7c" exitCode=0 Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.203070 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" event={"ID":"875bf125-544f-4899-ac00-797737833d7e","Type":"ContainerDied","Data":"b7f1836052e14774be03519bb4120838e3bf6b3b951beb1928c29424d0b96c7c"} Mar 20 17:51:11 crc kubenswrapper[4690]: W0320 17:51:11.212964 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb74da73d_632e_490b_b3c7_22450d29ede6.slice/crio-887887f200fa5c3a1eaa6884cb9c4dbb844650f20637e6f75a8bfa60d4499586 WatchSource:0}: Error finding container 887887f200fa5c3a1eaa6884cb9c4dbb844650f20637e6f75a8bfa60d4499586: Status 404 returned error can't find the container with id 887887f200fa5c3a1eaa6884cb9c4dbb844650f20637e6f75a8bfa60d4499586 Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.299862 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j8pr4"] Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.417182 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:51:11 crc kubenswrapper[4690]: W0320 17:51:11.423273 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod172668db_85fb_47e1_82fe_dee7c454993e.slice/crio-63a332b392b3825a435490c475eb32cefd00e1ff414077c7e389dfd04d745482 WatchSource:0}: Error finding container 63a332b392b3825a435490c475eb32cefd00e1ff414077c7e389dfd04d745482: Status 404 returned error can't find the container with id 63a332b392b3825a435490c475eb32cefd00e1ff414077c7e389dfd04d745482 Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.582855 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.654780 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.669143 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.777975 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6c828-810d-4ce8-b7d5-4242668f01db-config\") pod \"c7a6c828-810d-4ce8-b7d5-4242668f01db\" (UID: \"c7a6c828-810d-4ce8-b7d5-4242668f01db\") " Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.778572 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7a6c828-810d-4ce8-b7d5-4242668f01db-config" (OuterVolumeSpecName: "config") pod "c7a6c828-810d-4ce8-b7d5-4242668f01db" (UID: "c7a6c828-810d-4ce8-b7d5-4242668f01db"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.779331 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn5km\" (UniqueName: \"kubernetes.io/projected/c7a6c828-810d-4ce8-b7d5-4242668f01db-kube-api-access-pn5km\") pod \"c7a6c828-810d-4ce8-b7d5-4242668f01db\" (UID: \"c7a6c828-810d-4ce8-b7d5-4242668f01db\") " Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.779448 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-dns-svc\") pod \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.779572 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t557r\" (UniqueName: \"kubernetes.io/projected/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-kube-api-access-t557r\") pod \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.779753 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-config\") pod \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\" (UID: \"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e\") " Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.779915 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e" (UID: "90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.780309 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-config" (OuterVolumeSpecName: "config") pod "90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e" (UID: "90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.780442 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.780479 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7a6c828-810d-4ce8-b7d5-4242668f01db-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.793905 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a6c828-810d-4ce8-b7d5-4242668f01db-kube-api-access-pn5km" (OuterVolumeSpecName: "kube-api-access-pn5km") pod "c7a6c828-810d-4ce8-b7d5-4242668f01db" (UID: "c7a6c828-810d-4ce8-b7d5-4242668f01db"). InnerVolumeSpecName "kube-api-access-pn5km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.794730 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-kube-api-access-t557r" (OuterVolumeSpecName: "kube-api-access-t557r") pod "90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e" (UID: "90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e"). InnerVolumeSpecName "kube-api-access-t557r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.882635 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn5km\" (UniqueName: \"kubernetes.io/projected/c7a6c828-810d-4ce8-b7d5-4242668f01db-kube-api-access-pn5km\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.882672 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t557r\" (UniqueName: \"kubernetes.io/projected/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-kube-api-access-t557r\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.882683 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:11 crc kubenswrapper[4690]: I0320 17:51:11.923310 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8dmtk"] Mar 20 17:51:11 crc kubenswrapper[4690]: W0320 17:51:11.954933 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497fed5f_b87a_4042_ae22_186983ed7536.slice/crio-60da3c633cb587a03d2cd4a54df3b3e0b157d8bbb635d74ad9e00ab5cba95023 WatchSource:0}: Error finding container 60da3c633cb587a03d2cd4a54df3b3e0b157d8bbb635d74ad9e00ab5cba95023: Status 404 returned error can't find the container with id 60da3c633cb587a03d2cd4a54df3b3e0b157d8bbb635d74ad9e00ab5cba95023 Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.215696 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5855b86f-8504-4956-af4e-0cb3f9ace108","Type":"ContainerStarted","Data":"87d6401aa5bd45c5c7d4ae253c0af29f9b06720a31cb4e92b14b338f3af8ba94"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.218340 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" event={"ID":"c7a6c828-810d-4ce8-b7d5-4242668f01db","Type":"ContainerDied","Data":"14d33695b44e8c73d8f6c27efa7113255899d36cd39b02633f805f136c4fc1d8"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.218403 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n27gl" Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.225267 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8dmtk" event={"ID":"497fed5f-b87a-4042-ae22-186983ed7536","Type":"ContainerStarted","Data":"60da3c633cb587a03d2cd4a54df3b3e0b157d8bbb635d74ad9e00ab5cba95023"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.229151 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" event={"ID":"875bf125-544f-4899-ac00-797737833d7e","Type":"ContainerStarted","Data":"297e2d0cc4d07c43449282fda4c44a37804f3dafdcce9faa00dfe8bd4f838ecb"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.229379 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.230461 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.230532 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wmwmz" event={"ID":"90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e","Type":"ContainerDied","Data":"1d010598536502c745367fac9cfbc76add0e7f1d32099b409a8f8c2654be43eb"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.234096 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j8pr4" event={"ID":"f0e78344-d5a9-4bc2-9556-e3daf0ce19db","Type":"ContainerStarted","Data":"32bd226919571c97ffd91c1fc174c3765b9cb7b20ea2a34799b0db7b0f2abf32"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.245531 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b74da73d-632e-490b-b3c7-22450d29ede6","Type":"ContainerStarted","Data":"887887f200fa5c3a1eaa6884cb9c4dbb844650f20637e6f75a8bfa60d4499586"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.247734 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" event={"ID":"03da2fe9-5a86-4fba-87c7-7b2132c31d5f","Type":"ContainerStarted","Data":"23a76f8b21ec13fc0517e5d372cbf075afe6057ed89575877f2580dc8cb31056"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.248313 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.250928 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a","Type":"ContainerStarted","Data":"ef0dfdad5caeb4f1025ef60b593c2ac511de6da992eb175ba7fa94b3d775dc98"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.268202 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"172668db-85fb-47e1-82fe-dee7c454993e","Type":"ContainerStarted","Data":"63a332b392b3825a435490c475eb32cefd00e1ff414077c7e389dfd04d745482"} Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.284688 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n27gl"] Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.298064 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n27gl"] Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.323747 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-wmwmz"] Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.329987 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wmwmz"] Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.330371 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" podStartSLOduration=2.791647237 podStartE2EDuration="15.330358023s" podCreationTimestamp="2026-03-20 17:50:57 +0000 UTC" firstStartedPulling="2026-03-20 17:50:58.032643531 +0000 UTC m=+1132.898469209" lastFinishedPulling="2026-03-20 17:51:10.571354317 +0000 UTC m=+1145.437179995" observedRunningTime="2026-03-20 17:51:12.29102802 +0000 UTC m=+1147.156853698" watchObservedRunningTime="2026-03-20 17:51:12.330358023 +0000 UTC m=+1147.196183701" Mar 20 17:51:12 crc kubenswrapper[4690]: I0320 17:51:12.341131 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" podStartSLOduration=5.580813887 podStartE2EDuration="15.341116823s" podCreationTimestamp="2026-03-20 17:50:57 +0000 UTC" firstStartedPulling="2026-03-20 17:51:00.943537779 +0000 UTC m=+1135.809363467" lastFinishedPulling="2026-03-20 17:51:10.703840735 +0000 UTC m=+1145.569666403" observedRunningTime="2026-03-20 17:51:12.308437322 +0000 UTC m=+1147.174263000" watchObservedRunningTime="2026-03-20 17:51:12.341116823 +0000 UTC m=+1147.206942501" Mar 20 17:51:13 crc kubenswrapper[4690]: I0320 17:51:13.892864 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e" path="/var/lib/kubelet/pods/90a6ba98-0064-4f8e-bd8a-dfa13ddc8d2e/volumes" Mar 20 17:51:13 crc kubenswrapper[4690]: I0320 17:51:13.893351 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a6c828-810d-4ce8-b7d5-4242668f01db" path="/var/lib/kubelet/pods/c7a6c828-810d-4ce8-b7d5-4242668f01db/volumes" Mar 20 17:51:17 crc kubenswrapper[4690]: I0320 17:51:17.539419 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:51:17 crc kubenswrapper[4690]: I0320 17:51:17.761384 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:51:17 crc kubenswrapper[4690]: I0320 17:51:17.810469 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ztc9k"] Mar 20 17:51:18 crc kubenswrapper[4690]: I0320 17:51:18.314046 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" podUID="875bf125-544f-4899-ac00-797737833d7e" containerName="dnsmasq-dns" containerID="cri-o://297e2d0cc4d07c43449282fda4c44a37804f3dafdcce9faa00dfe8bd4f838ecb" gracePeriod=10 Mar 20 17:51:19 crc kubenswrapper[4690]: I0320 17:51:19.336614 4690 generic.go:334] "Generic (PLEG): container finished" podID="875bf125-544f-4899-ac00-797737833d7e" containerID="297e2d0cc4d07c43449282fda4c44a37804f3dafdcce9faa00dfe8bd4f838ecb" exitCode=0 Mar 20 17:51:19 crc kubenswrapper[4690]: I0320 17:51:19.336683 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" event={"ID":"875bf125-544f-4899-ac00-797737833d7e","Type":"ContainerDied","Data":"297e2d0cc4d07c43449282fda4c44a37804f3dafdcce9faa00dfe8bd4f838ecb"} Mar 20 17:51:19 crc kubenswrapper[4690]: I0320 17:51:19.980985 4690 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-metrics-bv2wl"] Mar 20 17:51:19 crc kubenswrapper[4690]: I0320 17:51:19.985810 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:19 crc kubenswrapper[4690]: I0320 17:51:19.987954 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.006736 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bv2wl"] Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.039081 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-combined-ca-bundle\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.039137 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.039191 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p87v5\" (UniqueName: \"kubernetes.io/projected/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-kube-api-access-p87v5\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.039217 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-config\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.039273 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-ovn-rundir\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.039307 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-ovs-rundir\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.100652 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6rv4v"] Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.101822 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.103395 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.117149 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6rv4v"] Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140307 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140360 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58trc\" (UniqueName: \"kubernetes.io/projected/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-kube-api-access-58trc\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140383 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140430 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-ovs-rundir\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140477 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-combined-ca-bundle\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140504 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140557 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p87v5\" (UniqueName: \"kubernetes.io/projected/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-kube-api-access-p87v5\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140580 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-config\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " 
pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140608 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-config\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140628 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-ovn-rundir\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140913 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-ovn-rundir\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.140981 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-ovs-rundir\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.142449 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-config\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.146182 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.146195 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-combined-ca-bundle\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.163810 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p87v5\" (UniqueName: \"kubernetes.io/projected/f8bc22b8-57e1-4cfd-bce8-446fb8cee600-kube-api-access-p87v5\") pod \"ovn-controller-metrics-bv2wl\" (UID: \"f8bc22b8-57e1-4cfd-bce8-446fb8cee600\") " pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.242103 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-config\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.242166 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.242190 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58trc\" (UniqueName: \"kubernetes.io/projected/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-kube-api-access-58trc\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.242207 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.243035 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-config\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.243051 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.243550 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.248675 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6rv4v"] Mar 20 17:51:20 crc kubenswrapper[4690]: E0320 17:51:20.249196 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-58trc], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" podUID="7b0024cf-c6b0-47d1-8d9f-5c91b52fd648" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.262198 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58trc\" (UniqueName: \"kubernetes.io/projected/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-kube-api-access-58trc\") pod \"dnsmasq-dns-7fd796d7df-6rv4v\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.288134 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lmjxl"] Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.290032 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.291544 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.305724 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lmjxl"] Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.306418 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bv2wl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.343126 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.343166 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.343232 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s22z\" (UniqueName: \"kubernetes.io/projected/1ec284e8-7aae-4e75-bd47-6326179eef7f-kube-api-access-6s22z\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.343275 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.343289 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-config\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.343608 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.352189 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.444908 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-dns-svc\") pod \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.444950 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-config\") pod \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.444999 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-ovsdbserver-nb\") pod \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.445025 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58trc\" (UniqueName: \"kubernetes.io/projected/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-kube-api-access-58trc\") pod \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\" (UID: \"7b0024cf-c6b0-47d1-8d9f-5c91b52fd648\") " Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.445186 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.445217 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.445274 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s22z\" (UniqueName: \"kubernetes.io/projected/1ec284e8-7aae-4e75-bd47-6326179eef7f-kube-api-access-6s22z\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.445302 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.445321 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-config\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.445610 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-config" (OuterVolumeSpecName: "config") pod "7b0024cf-c6b0-47d1-8d9f-5c91b52fd648" (UID: "7b0024cf-c6b0-47d1-8d9f-5c91b52fd648"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.446056 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b0024cf-c6b0-47d1-8d9f-5c91b52fd648" (UID: "7b0024cf-c6b0-47d1-8d9f-5c91b52fd648"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.446309 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.446320 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.446396 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.446830 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-config\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.446961 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b0024cf-c6b0-47d1-8d9f-5c91b52fd648" (UID: "7b0024cf-c6b0-47d1-8d9f-5c91b52fd648"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.450125 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-kube-api-access-58trc" (OuterVolumeSpecName: "kube-api-access-58trc") pod "7b0024cf-c6b0-47d1-8d9f-5c91b52fd648" (UID: "7b0024cf-c6b0-47d1-8d9f-5c91b52fd648"). InnerVolumeSpecName "kube-api-access-58trc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.463282 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s22z\" (UniqueName: \"kubernetes.io/projected/1ec284e8-7aae-4e75-bd47-6326179eef7f-kube-api-access-6s22z\") pod \"dnsmasq-dns-86db49b7ff-lmjxl\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.547330 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.547377 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.547392 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.547409 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58trc\" (UniqueName: \"kubernetes.io/projected/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648-kube-api-access-58trc\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:20 crc kubenswrapper[4690]: I0320 17:51:20.611034 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:21 crc kubenswrapper[4690]: I0320 17:51:21.393385 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6rv4v" Mar 20 17:51:21 crc kubenswrapper[4690]: I0320 17:51:21.541773 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6rv4v"] Mar 20 17:51:21 crc kubenswrapper[4690]: I0320 17:51:21.564696 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6rv4v"] Mar 20 17:51:21 crc kubenswrapper[4690]: I0320 17:51:21.895042 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0024cf-c6b0-47d1-8d9f-5c91b52fd648" path="/var/lib/kubelet/pods/7b0024cf-c6b0-47d1-8d9f-5c91b52fd648/volumes" Mar 20 17:51:24 crc kubenswrapper[4690]: I0320 17:51:24.274648 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:51:24 crc kubenswrapper[4690]: I0320 17:51:24.274736 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:51:27 crc kubenswrapper[4690]: I0320 17:51:27.539171 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" podUID="875bf125-544f-4899-ac00-797737833d7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.95:5353: i/o timeout" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.022312 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.022736 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n687h74h697h5bch65fh568h55h5c4h5bch66dh7bh676h546h89h5dch5bfh65bh678h94hc8h7fh55dh67h666h5f6h67bh665h5cch66bh6h67bh5b6q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vc9g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-j8pr4_openstack(f0e78344-d5a9-4bc2-9556-e3daf0ce19db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.023934 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-j8pr4" podUID="f0e78344-d5a9-4bc2-9556-e3daf0ce19db" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.209557 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.209850 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n85h65ch576hb9h5chc4h599h544h5f6h5ddh5d4h65ch68dh98hbch679h5h644h545h59dh69hcfh8h54h56fh55bh586h68ch67fh545h68bh649q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-km42d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(d64e7a22-5bb9-49ec-95d6-7ff145a31f9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.488459 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-j8pr4" podUID="f0e78344-d5a9-4bc2-9556-e3daf0ce19db" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.529537 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.529595 4690 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.529731 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vmmsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(5855b86f-8504-4956-af4e-0cb3f9ace108): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.530959 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="5855b86f-8504-4956-af4e-0cb3f9ace108" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.758524 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Mar 20 17:51:31 crc kubenswrapper[4690]: E0320 17:51:31.758698 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n596h54hc7hfh599h5b6h5cdh696h65fh5d8h6bhffh4h59dh646h96h5f5hc5h7ch5c8hddh65ch5d6h5b6h654h689h5b4h547h54ch69h5f7h579q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7x2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovsdbserver-nb-0_openstack(172668db-85fb-47e1-82fe-dee7c454993e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:51:31 crc kubenswrapper[4690]: I0320 17:51:31.798789 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:51:31 crc kubenswrapper[4690]: I0320 17:51:31.906489 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-dns-svc\") pod \"875bf125-544f-4899-ac00-797737833d7e\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " Mar 20 17:51:31 crc kubenswrapper[4690]: I0320 17:51:31.906677 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-config\") pod \"875bf125-544f-4899-ac00-797737833d7e\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " Mar 20 17:51:31 crc kubenswrapper[4690]: I0320 17:51:31.906754 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn267\" (UniqueName: \"kubernetes.io/projected/875bf125-544f-4899-ac00-797737833d7e-kube-api-access-dn267\") pod \"875bf125-544f-4899-ac00-797737833d7e\" (UID: \"875bf125-544f-4899-ac00-797737833d7e\") " Mar 20 17:51:31 crc kubenswrapper[4690]: I0320 17:51:31.910084 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875bf125-544f-4899-ac00-797737833d7e-kube-api-access-dn267" (OuterVolumeSpecName: "kube-api-access-dn267") pod "875bf125-544f-4899-ac00-797737833d7e" (UID: "875bf125-544f-4899-ac00-797737833d7e"). InnerVolumeSpecName "kube-api-access-dn267". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:31 crc kubenswrapper[4690]: I0320 17:51:31.946247 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "875bf125-544f-4899-ac00-797737833d7e" (UID: "875bf125-544f-4899-ac00-797737833d7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:31 crc kubenswrapper[4690]: I0320 17:51:31.947857 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-config" (OuterVolumeSpecName: "config") pod "875bf125-544f-4899-ac00-797737833d7e" (UID: "875bf125-544f-4899-ac00-797737833d7e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.008725 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.008771 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn267\" (UniqueName: \"kubernetes.io/projected/875bf125-544f-4899-ac00-797737833d7e-kube-api-access-dn267\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.008788 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/875bf125-544f-4899-ac00-797737833d7e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.495017 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" event={"ID":"875bf125-544f-4899-ac00-797737833d7e","Type":"ContainerDied","Data":"929044064d11a669a394297c9f0b2236304210d6bed8708c20050cac82b63b77"} Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.495049 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.495074 4690 scope.go:117] "RemoveContainer" containerID="297e2d0cc4d07c43449282fda4c44a37804f3dafdcce9faa00dfe8bd4f838ecb" Mar 20 17:51:32 crc kubenswrapper[4690]: E0320 17:51:32.496680 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="5855b86f-8504-4956-af4e-0cb3f9ace108" Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.534018 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ztc9k"] Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.539153 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ztc9k"] Mar 20 17:51:32 crc kubenswrapper[4690]: I0320 17:51:32.540515 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-ztc9k" podUID="875bf125-544f-4899-ac00-797737833d7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.95:5353: i/o timeout" Mar 20 17:51:33 crc kubenswrapper[4690]: I0320 17:51:33.676892 4690 scope.go:117] "RemoveContainer" containerID="b7f1836052e14774be03519bb4120838e3bf6b3b951beb1928c29424d0b96c7c" Mar 20 17:51:33 crc kubenswrapper[4690]: E0320 17:51:33.715448 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 17:51:33 crc kubenswrapper[4690]: E0320 17:51:33.715735 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:51:33 crc kubenswrapper[4690]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 17:51:33 crc kubenswrapper[4690]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 17:51:33 crc kubenswrapper[4690]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 17:51:33 crc kubenswrapper[4690]: cp /tmp/rabbitmq-plugins/enabled_plugins 
/operator/enabled_plugins Mar 20 17:51:33 crc kubenswrapper[4690]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:51:33 crc kubenswrapper[4690]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:51:33 crc kubenswrapper[4690]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:51:33 crc kubenswrapper[4690]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 17:51:33 crc kubenswrapper[4690]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6qrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(4ee4534b-8d84-4ca5-a8bc-10574d39d7bc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 17:51:33 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:51:33 crc kubenswrapper[4690]: E0320 17:51:33.719393 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" Mar 20 17:51:33 crc kubenswrapper[4690]: E0320 17:51:33.739706 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 17:51:33 crc kubenswrapper[4690]: E0320 17:51:33.739892 4690 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:51:33 crc kubenswrapper[4690]: init container 
&Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 17:51:33 crc kubenswrapper[4690]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 17:51:33 crc kubenswrapper[4690]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 17:51:33 crc kubenswrapper[4690]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 17:51:33 crc kubenswrapper[4690]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:51:33 crc kubenswrapper[4690]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:51:33 crc kubenswrapper[4690]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:51:33 crc kubenswrapper[4690]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 17:51:33 crc kubenswrapper[4690]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-869th,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 17:51:33 crc kubenswrapper[4690]: > logger="UnhandledError" Mar 20 17:51:33 crc kubenswrapper[4690]: E0320 17:51:33.741060 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" Mar 20 17:51:33 
crc kubenswrapper[4690]: I0320 17:51:33.893931 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875bf125-544f-4899-ac00-797737833d7e" path="/var/lib/kubelet/pods/875bf125-544f-4899-ac00-797737833d7e/volumes" Mar 20 17:51:34 crc kubenswrapper[4690]: I0320 17:51:34.095693 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lmjxl"] Mar 20 17:51:34 crc kubenswrapper[4690]: I0320 17:51:34.169531 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bv2wl"] Mar 20 17:51:34 crc kubenswrapper[4690]: E0320 17:51:34.517963 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" Mar 20 17:51:34 crc kubenswrapper[4690]: E0320 17:51:34.518471 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-server-0" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" Mar 20 17:51:35 crc kubenswrapper[4690]: I0320 17:51:35.527753 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bv2wl" event={"ID":"f8bc22b8-57e1-4cfd-bce8-446fb8cee600","Type":"ContainerStarted","Data":"bb54587b091dfab2b34eaca7d60a8932cd0d0ea0e5ab609a038601675547d5ec"} Mar 20 17:51:35 crc kubenswrapper[4690]: I0320 17:51:35.528926 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" event={"ID":"1ec284e8-7aae-4e75-bd47-6326179eef7f","Type":"ContainerStarted","Data":"60d3ab2bbb15770efe7dcf8115bb7c4000cd6386b32a6be6ddcc18c543e53d09"} Mar 20 17:51:35 crc kubenswrapper[4690]: E0320 17:51:35.920867 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="d64e7a22-5bb9-49ec-95d6-7ff145a31f9a" Mar 20 17:51:35 crc kubenswrapper[4690]: E0320 17:51:35.921136 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="172668db-85fb-47e1-82fe-dee7c454993e" Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.536917 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"172668db-85fb-47e1-82fe-dee7c454993e","Type":"ContainerStarted","Data":"5dc9eb7c749c81081e04aff1606fd8dfee2a83b25233ee007ef2c47e8d816dcf"} Mar 20 17:51:36 crc kubenswrapper[4690]: E0320 17:51:36.538175 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="172668db-85fb-47e1-82fe-dee7c454993e" Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.538286 4690 generic.go:334] "Generic (PLEG): container finished" podID="497fed5f-b87a-4042-ae22-186983ed7536" containerID="611860a3c18b1651d91646f843d36690ca60e55755a632dbcc2262b8038392a0" exitCode=0 
Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.538371 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8dmtk" event={"ID":"497fed5f-b87a-4042-ae22-186983ed7536","Type":"ContainerDied","Data":"611860a3c18b1651d91646f843d36690ca60e55755a632dbcc2262b8038392a0"} Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.540186 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dacc9bed-eaa9-4747-8a92-30f5afa0a698","Type":"ContainerStarted","Data":"9073212534f911f321915a03ee23453352cf0e925daad7b59a4f28ac93d0f9a7"} Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.546701 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b74da73d-632e-490b-b3c7-22450d29ede6","Type":"ContainerStarted","Data":"307a727d467a0529936f86f12ef784bcfefb88638fc32e278659d6142b250d3a"} Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.546832 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.549825 4690 generic.go:334] "Generic (PLEG): container finished" podID="1ec284e8-7aae-4e75-bd47-6326179eef7f" containerID="6bf6bdcd7318d5f656757b8d47a56206bcc57b804d2ea81a3466bcaaef721b20" exitCode=0 Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.549878 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" event={"ID":"1ec284e8-7aae-4e75-bd47-6326179eef7f","Type":"ContainerDied","Data":"6bf6bdcd7318d5f656757b8d47a56206bcc57b804d2ea81a3466bcaaef721b20"} Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.551482 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d4aa597c-8302-463f-a383-39c9a51baa2c","Type":"ContainerStarted","Data":"05f72247af08ee48b869b2ec37bc8614ab7d80db70c5e0b56369e8c4279f3539"} Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.552575 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a","Type":"ContainerStarted","Data":"89b735c0db2091b834e4a1156b5f5423e5a41b496481354814d55fb7c3aad0da"} Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.554173 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bv2wl" event={"ID":"f8bc22b8-57e1-4cfd-bce8-446fb8cee600","Type":"ContainerStarted","Data":"44f7e8e71812b83d71c25631a9cd8d1826d73e6a28792e90ed9f6e420356ba8b"} Mar 20 17:51:36 crc kubenswrapper[4690]: E0320 17:51:36.562382 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="d64e7a22-5bb9-49ec-95d6-7ff145a31f9a" Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.578446 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bv2wl" podStartSLOduration=17.089844797 podStartE2EDuration="17.578415865s" podCreationTimestamp="2026-03-20 17:51:19 +0000 UTC" firstStartedPulling="2026-03-20 17:51:35.415607534 +0000 UTC m=+1170.281433212" lastFinishedPulling="2026-03-20 17:51:35.904178602 +0000 UTC m=+1170.770004280" observedRunningTime="2026-03-20 17:51:36.574011249 +0000 UTC m=+1171.439836947" watchObservedRunningTime="2026-03-20 
17:51:36.578415865 +0000 UTC m=+1171.444241553" Mar 20 17:51:36 crc kubenswrapper[4690]: I0320 17:51:36.596391 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.306953276 podStartE2EDuration="35.59637472s" podCreationTimestamp="2026-03-20 17:51:01 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.232652943 +0000 UTC m=+1146.098478611" lastFinishedPulling="2026-03-20 17:51:31.522074357 +0000 UTC m=+1166.387900055" observedRunningTime="2026-03-20 17:51:36.595953817 +0000 UTC m=+1171.461779505" watchObservedRunningTime="2026-03-20 17:51:36.59637472 +0000 UTC m=+1171.462200398" Mar 20 17:51:37 crc kubenswrapper[4690]: I0320 17:51:37.563862 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" event={"ID":"1ec284e8-7aae-4e75-bd47-6326179eef7f","Type":"ContainerStarted","Data":"de0cff331526c76453f292c84b51632bb31a727a6cd1b72634f96ec3d1acd386"} Mar 20 17:51:37 crc kubenswrapper[4690]: I0320 17:51:37.565384 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:37 crc kubenswrapper[4690]: I0320 17:51:37.571683 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8dmtk" event={"ID":"497fed5f-b87a-4042-ae22-186983ed7536","Type":"ContainerStarted","Data":"3df87145603a9d919859a6caaedb73d524c87687726f3160c99660ba6bdb9edc"} Mar 20 17:51:37 crc kubenswrapper[4690]: I0320 17:51:37.571727 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8dmtk" event={"ID":"497fed5f-b87a-4042-ae22-186983ed7536","Type":"ContainerStarted","Data":"0e671af7e616f7370ae1e1fd77a01ba521fa39b02238f969794d23cba0756052"} Mar 20 17:51:37 crc kubenswrapper[4690]: E0320 17:51:37.575641 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="d64e7a22-5bb9-49ec-95d6-7ff145a31f9a" Mar 20 17:51:37 crc kubenswrapper[4690]: E0320 17:51:37.575861 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="172668db-85fb-47e1-82fe-dee7c454993e" Mar 20 17:51:37 crc kubenswrapper[4690]: I0320 17:51:37.587709 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" podStartSLOduration=17.587689581 podStartE2EDuration="17.587689581s" podCreationTimestamp="2026-03-20 17:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:51:37.58453103 +0000 UTC m=+1172.450356728" watchObservedRunningTime="2026-03-20 17:51:37.587689581 +0000 UTC m=+1172.453515289" Mar 20 17:51:37 crc kubenswrapper[4690]: I0320 17:51:37.630190 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8dmtk" podStartSLOduration=10.819884189 podStartE2EDuration="30.630168447s" podCreationTimestamp="2026-03-20 17:51:07 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.957918601 +0000 UTC m=+1146.823744269" lastFinishedPulling="2026-03-20 
17:51:31.768202849 +0000 UTC m=+1166.634028527" observedRunningTime="2026-03-20 17:51:37.626640546 +0000 UTC m=+1172.492466224" watchObservedRunningTime="2026-03-20 17:51:37.630168447 +0000 UTC m=+1172.495994135" Mar 20 17:51:38 crc kubenswrapper[4690]: I0320 17:51:38.581816 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:38 crc kubenswrapper[4690]: I0320 17:51:38.582576 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:51:40 crc kubenswrapper[4690]: I0320 17:51:40.603701 4690 generic.go:334] "Generic (PLEG): container finished" podID="dacc9bed-eaa9-4747-8a92-30f5afa0a698" containerID="9073212534f911f321915a03ee23453352cf0e925daad7b59a4f28ac93d0f9a7" exitCode=0 Mar 20 17:51:40 crc kubenswrapper[4690]: I0320 17:51:40.603780 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dacc9bed-eaa9-4747-8a92-30f5afa0a698","Type":"ContainerDied","Data":"9073212534f911f321915a03ee23453352cf0e925daad7b59a4f28ac93d0f9a7"} Mar 20 17:51:40 crc kubenswrapper[4690]: I0320 17:51:40.606842 4690 generic.go:334] "Generic (PLEG): container finished" podID="d4aa597c-8302-463f-a383-39c9a51baa2c" containerID="05f72247af08ee48b869b2ec37bc8614ab7d80db70c5e0b56369e8c4279f3539" exitCode=0 Mar 20 17:51:40 crc kubenswrapper[4690]: I0320 17:51:40.608157 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d4aa597c-8302-463f-a383-39c9a51baa2c","Type":"ContainerDied","Data":"05f72247af08ee48b869b2ec37bc8614ab7d80db70c5e0b56369e8c4279f3539"} Mar 20 17:51:41 crc kubenswrapper[4690]: I0320 17:51:41.617692 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d4aa597c-8302-463f-a383-39c9a51baa2c","Type":"ContainerStarted","Data":"470bf7042b01de174e0169bf19aa5082acfdeaee78264a2a1f9608d0de86a720"} Mar 20 17:51:41 crc kubenswrapper[4690]: I0320 17:51:41.620349 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dacc9bed-eaa9-4747-8a92-30f5afa0a698","Type":"ContainerStarted","Data":"e679d29d0f2289fda3390e87dc7c6c0cfad1f67f9a87a5ce4cfecded6003e0a5"} Mar 20 17:51:41 crc kubenswrapper[4690]: I0320 17:51:41.653582 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 17:51:41 crc kubenswrapper[4690]: I0320 17:51:41.659465 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.04225121 podStartE2EDuration="41.659440033s" podCreationTimestamp="2026-03-20 17:51:00 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.02779092 +0000 UTC m=+1145.893616598" lastFinishedPulling="2026-03-20 17:51:33.644979743 +0000 UTC m=+1168.510805421" observedRunningTime="2026-03-20 17:51:41.652775732 +0000 UTC m=+1176.518601440" watchObservedRunningTime="2026-03-20 17:51:41.659440033 +0000 UTC m=+1176.525265751" Mar 20 17:51:41 crc kubenswrapper[4690]: I0320 17:51:41.689370 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.128194348 podStartE2EDuration="43.689342749s" podCreationTimestamp="2026-03-20 17:50:58 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.123011253 +0000 UTC m=+1145.988836931" lastFinishedPulling="2026-03-20 17:51:33.684159654 +0000 UTC m=+1168.549985332" 
observedRunningTime="2026-03-20 17:51:41.675370539 +0000 UTC m=+1176.541196257" watchObservedRunningTime="2026-03-20 17:51:41.689342749 +0000 UTC m=+1176.555168437" Mar 20 17:51:43 crc kubenswrapper[4690]: E0320 17:51:43.435556 4690 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.192:45566->38.102.83.192:45043: write tcp 38.102.83.192:45566->38.102.83.192:45043: write: broken pipe Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.218614 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lmjxl"] Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.218818 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" podUID="1ec284e8-7aae-4e75-bd47-6326179eef7f" containerName="dnsmasq-dns" containerID="cri-o://de0cff331526c76453f292c84b51632bb31a727a6cd1b72634f96ec3d1acd386" gracePeriod=10 Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.220514 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.272604 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-b56kg"] Mar 20 17:51:44 crc kubenswrapper[4690]: E0320 17:51:44.272904 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875bf125-544f-4899-ac00-797737833d7e" containerName="init" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.272922 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="875bf125-544f-4899-ac00-797737833d7e" containerName="init" Mar 20 17:51:44 crc kubenswrapper[4690]: E0320 17:51:44.272936 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875bf125-544f-4899-ac00-797737833d7e" containerName="dnsmasq-dns" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.272943 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="875bf125-544f-4899-ac00-797737833d7e" containerName="dnsmasq-dns" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.273085 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="875bf125-544f-4899-ac00-797737833d7e" containerName="dnsmasq-dns" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.273860 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.337937 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b56kg"] Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.429752 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-dns-svc\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.429822 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.429883 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqqb7\" (UniqueName: \"kubernetes.io/projected/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-kube-api-access-zqqb7\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.429901 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-config\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.430022 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.532046 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.532120 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-dns-svc\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.532160 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.532188 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zqqb7\" (UniqueName: \"kubernetes.io/projected/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-kube-api-access-zqqb7\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.532208 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-config\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.533392 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-dns-svc\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.533442 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-config\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.534121 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.534207 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.573227 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqqb7\" (UniqueName: \"kubernetes.io/projected/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-kube-api-access-zqqb7\") pod \"dnsmasq-dns-698758b865-b56kg\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.643941 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5855b86f-8504-4956-af4e-0cb3f9ace108","Type":"ContainerStarted","Data":"ac47407de899afac863001490b2003df80d11e3399b69af6add689ae63492528"} Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.644306 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.648529 4690 generic.go:334] "Generic (PLEG): container finished" podID="1ec284e8-7aae-4e75-bd47-6326179eef7f" containerID="de0cff331526c76453f292c84b51632bb31a727a6cd1b72634f96ec3d1acd386" exitCode=0 Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.648596 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" 
event={"ID":"1ec284e8-7aae-4e75-bd47-6326179eef7f","Type":"ContainerDied","Data":"de0cff331526c76453f292c84b51632bb31a727a6cd1b72634f96ec3d1acd386"} Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.648643 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" event={"ID":"1ec284e8-7aae-4e75-bd47-6326179eef7f","Type":"ContainerDied","Data":"60d3ab2bbb15770efe7dcf8115bb7c4000cd6386b32a6be6ddcc18c543e53d09"} Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.648662 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60d3ab2bbb15770efe7dcf8115bb7c4000cd6386b32a6be6ddcc18c543e53d09" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.650772 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.664106 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.619905114 podStartE2EDuration="41.664086024s" podCreationTimestamp="2026-03-20 17:51:03 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.200437875 +0000 UTC m=+1146.066263563" lastFinishedPulling="2026-03-20 17:51:44.244618795 +0000 UTC m=+1179.110444473" observedRunningTime="2026-03-20 17:51:44.659941256 +0000 UTC m=+1179.525766944" watchObservedRunningTime="2026-03-20 17:51:44.664086024 +0000 UTC m=+1179.529911712" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.711658 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.735286 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-sb\") pod \"1ec284e8-7aae-4e75-bd47-6326179eef7f\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.735419 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-nb\") pod \"1ec284e8-7aae-4e75-bd47-6326179eef7f\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.735490 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-config\") pod \"1ec284e8-7aae-4e75-bd47-6326179eef7f\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.735534 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-dns-svc\") pod \"1ec284e8-7aae-4e75-bd47-6326179eef7f\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.735587 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s22z\" (UniqueName: \"kubernetes.io/projected/1ec284e8-7aae-4e75-bd47-6326179eef7f-kube-api-access-6s22z\") pod \"1ec284e8-7aae-4e75-bd47-6326179eef7f\" (UID: \"1ec284e8-7aae-4e75-bd47-6326179eef7f\") " Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.739930 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/1ec284e8-7aae-4e75-bd47-6326179eef7f-kube-api-access-6s22z" (OuterVolumeSpecName: "kube-api-access-6s22z") pod "1ec284e8-7aae-4e75-bd47-6326179eef7f" (UID: "1ec284e8-7aae-4e75-bd47-6326179eef7f"). InnerVolumeSpecName "kube-api-access-6s22z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.772497 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1ec284e8-7aae-4e75-bd47-6326179eef7f" (UID: "1ec284e8-7aae-4e75-bd47-6326179eef7f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.773032 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1ec284e8-7aae-4e75-bd47-6326179eef7f" (UID: "1ec284e8-7aae-4e75-bd47-6326179eef7f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.782951 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ec284e8-7aae-4e75-bd47-6326179eef7f" (UID: "1ec284e8-7aae-4e75-bd47-6326179eef7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.784904 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-config" (OuterVolumeSpecName: "config") pod "1ec284e8-7aae-4e75-bd47-6326179eef7f" (UID: "1ec284e8-7aae-4e75-bd47-6326179eef7f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.837518 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.837549 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.837558 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s22z\" (UniqueName: \"kubernetes.io/projected/1ec284e8-7aae-4e75-bd47-6326179eef7f-kube-api-access-6s22z\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.837570 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:44 crc kubenswrapper[4690]: I0320 17:51:44.837578 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ec284e8-7aae-4e75-bd47-6326179eef7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.125987 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b56kg"] Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.379824 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:51:45 crc kubenswrapper[4690]: E0320 17:51:45.380192 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec284e8-7aae-4e75-bd47-6326179eef7f" containerName="dnsmasq-dns" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.380214 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec284e8-7aae-4e75-bd47-6326179eef7f" containerName="dnsmasq-dns" Mar 20 17:51:45 crc kubenswrapper[4690]: E0320 17:51:45.380234 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec284e8-7aae-4e75-bd47-6326179eef7f" containerName="init" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.380243 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec284e8-7aae-4e75-bd47-6326179eef7f" containerName="init" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.380464 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec284e8-7aae-4e75-bd47-6326179eef7f" containerName="dnsmasq-dns" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.387619 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.390566 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.391562 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.391826 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.392884 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-g99rl" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.399233 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.549654 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.549731 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4191d8a1-c023-4412-a90c-e819672da33f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.549833 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4191d8a1-c023-4412-a90c-e819672da33f-lock\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.549865 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnjcj\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-kube-api-access-xnjcj\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.550170 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4191d8a1-c023-4412-a90c-e819672da33f-cache\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.550217 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.615032 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dsjgc"] Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.616601 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.618199 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.619206 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.620560 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.632310 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dsjgc"] Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.651696 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.651755 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.651783 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4191d8a1-c023-4412-a90c-e819672da33f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.651837 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4191d8a1-c023-4412-a90c-e819672da33f-lock\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.651856 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnjcj\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-kube-api-access-xnjcj\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.651915 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4191d8a1-c023-4412-a90c-e819672da33f-cache\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: E0320 17:51:45.652175 4690 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:51:45 crc kubenswrapper[4690]: E0320 17:51:45.652211 4690 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:51:45 crc kubenswrapper[4690]: E0320 17:51:45.652278 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift podName:4191d8a1-c023-4412-a90c-e819672da33f nodeName:}" failed. 
No retries permitted until 2026-03-20 17:51:46.152242635 +0000 UTC m=+1181.018068423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift") pod "swift-storage-0" (UID: "4191d8a1-c023-4412-a90c-e819672da33f") : configmap "swift-ring-files" not found Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.652185 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.652366 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4191d8a1-c023-4412-a90c-e819672da33f-cache\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.652718 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4191d8a1-c023-4412-a90c-e819672da33f-lock\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.664513 4690 generic.go:334] "Generic (PLEG): container finished" podID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" containerID="a2e0e3a98dc062d76f2ede084588cd281812e1a12f6d49f55bc2043c7783f75b" exitCode=0 Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.664707 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b56kg" event={"ID":"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce","Type":"ContainerDied","Data":"a2e0e3a98dc062d76f2ede084588cd281812e1a12f6d49f55bc2043c7783f75b"} Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.664863 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b56kg" event={"ID":"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce","Type":"ContainerStarted","Data":"629c969a0c79b47e304c722bbdc5b10f3345de19739324c1f0dd3c9ea65e8a75"} Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.664978 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lmjxl" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.672884 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4191d8a1-c023-4412-a90c-e819672da33f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.674001 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnjcj\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-kube-api-access-xnjcj\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.676491 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.752783 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-scripts\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.752828 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-ring-data-devices\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.752847 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4wwn\" (UniqueName: \"kubernetes.io/projected/7930c325-4b03-450e-b3d0-b7116efc71cb-kube-api-access-x4wwn\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.752903 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7930c325-4b03-450e-b3d0-b7116efc71cb-etc-swift\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.752957 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-combined-ca-bundle\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.752980 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-swiftconf\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 
17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.753035 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-dispersionconf\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.830862 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lmjxl"] Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.843416 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lmjxl"] Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.854562 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-scripts\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.854646 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-ring-data-devices\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.854685 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4wwn\" (UniqueName: \"kubernetes.io/projected/7930c325-4b03-450e-b3d0-b7116efc71cb-kube-api-access-x4wwn\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.854774 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7930c325-4b03-450e-b3d0-b7116efc71cb-etc-swift\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.854863 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-combined-ca-bundle\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.854904 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-swiftconf\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.854969 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-dispersionconf\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.855516 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-scripts\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.855886 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7930c325-4b03-450e-b3d0-b7116efc71cb-etc-swift\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.858464 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-ring-data-devices\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.858631 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-combined-ca-bundle\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.862496 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-dispersionconf\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.866340 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-swiftconf\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.878567 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4wwn\" (UniqueName: \"kubernetes.io/projected/7930c325-4b03-450e-b3d0-b7116efc71cb-kube-api-access-x4wwn\") pod \"swift-ring-rebalance-dsjgc\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.895571 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec284e8-7aae-4e75-bd47-6326179eef7f" path="/var/lib/kubelet/pods/1ec284e8-7aae-4e75-bd47-6326179eef7f/volumes" Mar 20 17:51:45 crc kubenswrapper[4690]: I0320 17:51:45.932424 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:46 crc kubenswrapper[4690]: I0320 17:51:46.160785 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:46 crc kubenswrapper[4690]: E0320 17:51:46.160998 4690 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:51:46 crc kubenswrapper[4690]: E0320 17:51:46.161161 4690 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:51:46 crc kubenswrapper[4690]: E0320 17:51:46.161214 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift podName:4191d8a1-c023-4412-a90c-e819672da33f nodeName:}" failed. No retries permitted until 2026-03-20 17:51:47.161198715 +0000 UTC m=+1182.027024393 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift") pod "swift-storage-0" (UID: "4191d8a1-c023-4412-a90c-e819672da33f") : configmap "swift-ring-files" not found Mar 20 17:51:46 crc kubenswrapper[4690]: I0320 17:51:46.392611 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dsjgc"] Mar 20 17:51:46 crc kubenswrapper[4690]: I0320 17:51:46.676355 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b56kg" event={"ID":"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce","Type":"ContainerStarted","Data":"1108152d0e8f6299034524770dd94f6864cd5941547739ec01c98a829e06ef56"} Mar 20 17:51:46 crc kubenswrapper[4690]: I0320 17:51:46.676666 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:46 crc kubenswrapper[4690]: I0320 17:51:46.678662 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dsjgc" event={"ID":"7930c325-4b03-450e-b3d0-b7116efc71cb","Type":"ContainerStarted","Data":"774caecb580b5883e271f086a0255ab1c88d78dfa64b1557e7ffdf4bcaf5c540"} Mar 20 17:51:46 crc kubenswrapper[4690]: I0320 17:51:46.704620 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-b56kg" podStartSLOduration=2.704601384 podStartE2EDuration="2.704601384s" podCreationTimestamp="2026-03-20 17:51:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:51:46.699898349 +0000 UTC m=+1181.565724027" watchObservedRunningTime="2026-03-20 17:51:46.704601384 +0000 UTC m=+1181.570427082" Mar 20 17:51:47 crc kubenswrapper[4690]: I0320 17:51:47.179020 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:47 crc kubenswrapper[4690]: E0320 17:51:47.179191 4690 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:51:47 crc kubenswrapper[4690]: E0320 
17:51:47.179211 4690 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:51:47 crc kubenswrapper[4690]: E0320 17:51:47.179287 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift podName:4191d8a1-c023-4412-a90c-e819672da33f nodeName:}" failed. No retries permitted until 2026-03-20 17:51:49.179267554 +0000 UTC m=+1184.045093232 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift") pod "swift-storage-0" (UID: "4191d8a1-c023-4412-a90c-e819672da33f") : configmap "swift-ring-files" not found Mar 20 17:51:47 crc kubenswrapper[4690]: I0320 17:51:47.689576 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j8pr4" event={"ID":"f0e78344-d5a9-4bc2-9556-e3daf0ce19db","Type":"ContainerStarted","Data":"12e06e4bf6bc8f74ac99f1edec1a79ee284b6ad8bec7a70b822efc53f469f620"} Mar 20 17:51:47 crc kubenswrapper[4690]: I0320 17:51:47.690206 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-j8pr4" Mar 20 17:51:47 crc kubenswrapper[4690]: I0320 17:51:47.691362 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7","Type":"ContainerStarted","Data":"3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406"} Mar 20 17:51:47 crc kubenswrapper[4690]: I0320 17:51:47.715597 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-j8pr4" podStartSLOduration=5.668733367 podStartE2EDuration="40.715575658s" podCreationTimestamp="2026-03-20 17:51:07 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.33458762 +0000 UTC m=+1146.200413298" lastFinishedPulling="2026-03-20 17:51:46.381429901 +0000 UTC m=+1181.247255589" observedRunningTime="2026-03-20 17:51:47.71108681 +0000 UTC m=+1182.576912508" watchObservedRunningTime="2026-03-20 17:51:47.715575658 +0000 UTC m=+1182.581401336" Mar 20 17:51:49 crc kubenswrapper[4690]: I0320 17:51:49.237245 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:49 crc kubenswrapper[4690]: E0320 17:51:49.237575 4690 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:51:49 crc kubenswrapper[4690]: E0320 17:51:49.237623 4690 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:51:49 crc kubenswrapper[4690]: E0320 17:51:49.237711 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift podName:4191d8a1-c023-4412-a90c-e819672da33f nodeName:}" failed. No retries permitted until 2026-03-20 17:51:53.237683576 +0000 UTC m=+1188.103509284 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift") pod "swift-storage-0" (UID: "4191d8a1-c023-4412-a90c-e819672da33f") : configmap "swift-ring-files" not found Mar 20 17:51:50 crc kubenswrapper[4690]: I0320 17:51:50.078998 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 17:51:50 crc kubenswrapper[4690]: I0320 17:51:50.080142 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 17:51:50 crc kubenswrapper[4690]: I0320 17:51:50.167689 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 17:51:50 crc kubenswrapper[4690]: I0320 17:51:50.723546 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d64e7a22-5bb9-49ec-95d6-7ff145a31f9a","Type":"ContainerStarted","Data":"dbdeabe5c6af5b9c9d123c15c503a8e25a884a7d3cd426949c7132f761556675"} Mar 20 17:51:50 crc kubenswrapper[4690]: I0320 17:51:50.726044 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dsjgc" event={"ID":"7930c325-4b03-450e-b3d0-b7116efc71cb","Type":"ContainerStarted","Data":"3b92eaf68e23ee0ed730d3fc39929d439f967abd6608547f33f138444f77c6db"} Mar 20 17:51:50 crc kubenswrapper[4690]: I0320 17:51:50.747672 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.519991902 podStartE2EDuration="41.747655205s" podCreationTimestamp="2026-03-20 17:51:09 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.628986033 +0000 UTC m=+1146.494811711" lastFinishedPulling="2026-03-20 17:51:49.856649326 +0000 UTC m=+1184.722475014" observedRunningTime="2026-03-20 17:51:50.744181796 +0000 UTC m=+1185.610007474" watchObservedRunningTime="2026-03-20 17:51:50.747655205 +0000 UTC m=+1185.613480883" Mar 20 17:51:50 crc kubenswrapper[4690]: I0320 17:51:50.769419 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-dsjgc" podStartSLOduration=2.3010300089999998 podStartE2EDuration="5.769397618s" podCreationTimestamp="2026-03-20 17:51:45 +0000 UTC" firstStartedPulling="2026-03-20 17:51:46.396456821 +0000 UTC m=+1181.262282499" lastFinishedPulling="2026-03-20 17:51:49.86482441 +0000 UTC m=+1184.730650108" observedRunningTime="2026-03-20 17:51:50.760835012 +0000 UTC m=+1185.626660700" watchObservedRunningTime="2026-03-20 17:51:50.769397618 +0000 UTC m=+1185.635223296" Mar 20 17:51:50 crc kubenswrapper[4690]: I0320 17:51:50.917449 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.009114 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.483355 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.483669 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.556788 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 
17:51:51.748890 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc","Type":"ContainerStarted","Data":"e32fdf8c2102bd7ba3cad331c73e818ce6d0901e8c7c47f023f4120143d2d905"} Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.895395 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.925403 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9jdjw"] Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.929053 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.976246 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d6a5-account-create-update-kwm5l"] Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.977242 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.979513 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.990603 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lcbz\" (UniqueName: \"kubernetes.io/projected/08b775c4-9217-4241-8049-1253db4ecb81-kube-api-access-6lcbz\") pod \"glance-db-create-9jdjw\" (UID: \"08b775c4-9217-4241-8049-1253db4ecb81\") " pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:51 crc kubenswrapper[4690]: I0320 17:51:51.990697 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b775c4-9217-4241-8049-1253db4ecb81-operator-scripts\") pod \"glance-db-create-9jdjw\" (UID: \"08b775c4-9217-4241-8049-1253db4ecb81\") " pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:51.994838 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9jdjw"] Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.004704 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d6a5-account-create-update-kwm5l"] Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.092416 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lcbz\" (UniqueName: \"kubernetes.io/projected/08b775c4-9217-4241-8049-1253db4ecb81-kube-api-access-6lcbz\") pod \"glance-db-create-9jdjw\" (UID: \"08b775c4-9217-4241-8049-1253db4ecb81\") " pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.092513 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnxmx\" (UniqueName: \"kubernetes.io/projected/f8409e1a-31b0-4050-86bf-69c3d18c6185-kube-api-access-tnxmx\") pod \"glance-d6a5-account-create-update-kwm5l\" (UID: \"f8409e1a-31b0-4050-86bf-69c3d18c6185\") " pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.092554 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b775c4-9217-4241-8049-1253db4ecb81-operator-scripts\") pod 
\"glance-db-create-9jdjw\" (UID: \"08b775c4-9217-4241-8049-1253db4ecb81\") " pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.092593 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8409e1a-31b0-4050-86bf-69c3d18c6185-operator-scripts\") pod \"glance-d6a5-account-create-update-kwm5l\" (UID: \"f8409e1a-31b0-4050-86bf-69c3d18c6185\") " pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.093427 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b775c4-9217-4241-8049-1253db4ecb81-operator-scripts\") pod \"glance-db-create-9jdjw\" (UID: \"08b775c4-9217-4241-8049-1253db4ecb81\") " pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.109408 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lcbz\" (UniqueName: \"kubernetes.io/projected/08b775c4-9217-4241-8049-1253db4ecb81-kube-api-access-6lcbz\") pod \"glance-db-create-9jdjw\" (UID: \"08b775c4-9217-4241-8049-1253db4ecb81\") " pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.196435 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8409e1a-31b0-4050-86bf-69c3d18c6185-operator-scripts\") pod \"glance-d6a5-account-create-update-kwm5l\" (UID: \"f8409e1a-31b0-4050-86bf-69c3d18c6185\") " pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.196896 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnxmx\" (UniqueName: \"kubernetes.io/projected/f8409e1a-31b0-4050-86bf-69c3d18c6185-kube-api-access-tnxmx\") pod \"glance-d6a5-account-create-update-kwm5l\" (UID: \"f8409e1a-31b0-4050-86bf-69c3d18c6185\") " pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.197652 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8409e1a-31b0-4050-86bf-69c3d18c6185-operator-scripts\") pod \"glance-d6a5-account-create-update-kwm5l\" (UID: \"f8409e1a-31b0-4050-86bf-69c3d18c6185\") " pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.223070 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnxmx\" (UniqueName: \"kubernetes.io/projected/f8409e1a-31b0-4050-86bf-69c3d18c6185-kube-api-access-tnxmx\") pod \"glance-d6a5-account-create-update-kwm5l\" (UID: \"f8409e1a-31b0-4050-86bf-69c3d18c6185\") " pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.250090 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.294614 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.742359 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9jdjw"] Mar 20 17:51:52 crc kubenswrapper[4690]: W0320 17:51:52.758543 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08b775c4_9217_4241_8049_1253db4ecb81.slice/crio-a108dd31c9ffaa0103e27e2e524f47a6c3346f072f8913e4cef7282ed02bc380 WatchSource:0}: Error finding container a108dd31c9ffaa0103e27e2e524f47a6c3346f072f8913e4cef7282ed02bc380: Status 404 returned error can't find the container with id a108dd31c9ffaa0103e27e2e524f47a6c3346f072f8913e4cef7282ed02bc380 Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.776465 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qch77"] Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.777412 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qch77" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.792908 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qch77"] Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.828445 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d6a5-account-create-update-kwm5l"] Mar 20 17:51:52 crc kubenswrapper[4690]: W0320 17:51:52.834594 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8409e1a_31b0_4050_86bf_69c3d18c6185.slice/crio-f807e34fdda2d70ded812d071a24dc4569a871ca8a8e993ddcd34d924ba45092 WatchSource:0}: Error finding container f807e34fdda2d70ded812d071a24dc4569a871ca8a8e993ddcd34d924ba45092: Status 404 returned error can't find the container with id f807e34fdda2d70ded812d071a24dc4569a871ca8a8e993ddcd34d924ba45092 Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.878002 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-98d6-account-create-update-hbppj"] Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.879380 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.882840 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.886980 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-98d6-account-create-update-hbppj"] Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.918566 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tfhn\" (UniqueName: \"kubernetes.io/projected/8762da29-e17c-42a8-b233-a6c565c3a6de-kube-api-access-7tfhn\") pod \"keystone-db-create-qch77\" (UID: \"8762da29-e17c-42a8-b233-a6c565c3a6de\") " pod="openstack/keystone-db-create-qch77" Mar 20 17:51:52 crc kubenswrapper[4690]: I0320 17:51:52.918673 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8762da29-e17c-42a8-b233-a6c565c3a6de-operator-scripts\") pod \"keystone-db-create-qch77\" (UID: \"8762da29-e17c-42a8-b233-a6c565c3a6de\") " pod="openstack/keystone-db-create-qch77" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.009208 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.020806 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpv6g\" (UniqueName: \"kubernetes.io/projected/d6415de2-c5b2-4077-91d9-74f1c0852b56-kube-api-access-dpv6g\") pod \"keystone-98d6-account-create-update-hbppj\" (UID: \"d6415de2-c5b2-4077-91d9-74f1c0852b56\") " pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.020940 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfhn\" (UniqueName: \"kubernetes.io/projected/8762da29-e17c-42a8-b233-a6c565c3a6de-kube-api-access-7tfhn\") pod \"keystone-db-create-qch77\" (UID: \"8762da29-e17c-42a8-b233-a6c565c3a6de\") " pod="openstack/keystone-db-create-qch77" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.021005 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8762da29-e17c-42a8-b233-a6c565c3a6de-operator-scripts\") pod \"keystone-db-create-qch77\" (UID: \"8762da29-e17c-42a8-b233-a6c565c3a6de\") " pod="openstack/keystone-db-create-qch77" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.021070 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6415de2-c5b2-4077-91d9-74f1c0852b56-operator-scripts\") pod \"keystone-98d6-account-create-update-hbppj\" (UID: \"d6415de2-c5b2-4077-91d9-74f1c0852b56\") " pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.022685 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8762da29-e17c-42a8-b233-a6c565c3a6de-operator-scripts\") pod \"keystone-db-create-qch77\" (UID: \"8762da29-e17c-42a8-b233-a6c565c3a6de\") " pod="openstack/keystone-db-create-qch77" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.042470 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7tfhn\" (UniqueName: \"kubernetes.io/projected/8762da29-e17c-42a8-b233-a6c565c3a6de-kube-api-access-7tfhn\") pod \"keystone-db-create-qch77\" (UID: \"8762da29-e17c-42a8-b233-a6c565c3a6de\") " pod="openstack/keystone-db-create-qch77" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.052712 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.076131 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6nm7z"] Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.077427 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.091301 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-43e3-account-create-update-hb8zx"] Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.092375 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.094181 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.095282 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qch77" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.113540 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-43e3-account-create-update-hb8zx"] Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.122346 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6415de2-c5b2-4077-91d9-74f1c0852b56-operator-scripts\") pod \"keystone-98d6-account-create-update-hbppj\" (UID: \"d6415de2-c5b2-4077-91d9-74f1c0852b56\") " pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.122482 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpv6g\" (UniqueName: \"kubernetes.io/projected/d6415de2-c5b2-4077-91d9-74f1c0852b56-kube-api-access-dpv6g\") pod \"keystone-98d6-account-create-update-hbppj\" (UID: \"d6415de2-c5b2-4077-91d9-74f1c0852b56\") " pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.122641 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e760b621-28bc-4ede-b08d-8e46250407eb-operator-scripts\") pod \"placement-db-create-6nm7z\" (UID: \"e760b621-28bc-4ede-b08d-8e46250407eb\") " pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.122691 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxhfs\" (UniqueName: \"kubernetes.io/projected/e760b621-28bc-4ede-b08d-8e46250407eb-kube-api-access-jxhfs\") pod \"placement-db-create-6nm7z\" (UID: \"e760b621-28bc-4ede-b08d-8e46250407eb\") " pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.124507 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d6415de2-c5b2-4077-91d9-74f1c0852b56-operator-scripts\") pod \"keystone-98d6-account-create-update-hbppj\" (UID: \"d6415de2-c5b2-4077-91d9-74f1c0852b56\") " pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.139148 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6nm7z"] Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.141186 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpv6g\" (UniqueName: \"kubernetes.io/projected/d6415de2-c5b2-4077-91d9-74f1c0852b56-kube-api-access-dpv6g\") pod \"keystone-98d6-account-create-update-hbppj\" (UID: \"d6415de2-c5b2-4077-91d9-74f1c0852b56\") " pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.198914 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.225144 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e760b621-28bc-4ede-b08d-8e46250407eb-operator-scripts\") pod \"placement-db-create-6nm7z\" (UID: \"e760b621-28bc-4ede-b08d-8e46250407eb\") " pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.225182 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxhfs\" (UniqueName: \"kubernetes.io/projected/e760b621-28bc-4ede-b08d-8e46250407eb-kube-api-access-jxhfs\") pod \"placement-db-create-6nm7z\" (UID: \"e760b621-28bc-4ede-b08d-8e46250407eb\") " pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.225210 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995742f4-b26d-4f30-ae2c-2635257cd664-operator-scripts\") pod \"placement-43e3-account-create-update-hb8zx\" (UID: \"995742f4-b26d-4f30-ae2c-2635257cd664\") " pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.225273 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h9w5\" (UniqueName: \"kubernetes.io/projected/995742f4-b26d-4f30-ae2c-2635257cd664-kube-api-access-2h9w5\") pod \"placement-43e3-account-create-update-hb8zx\" (UID: \"995742f4-b26d-4f30-ae2c-2635257cd664\") " pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.225983 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e760b621-28bc-4ede-b08d-8e46250407eb-operator-scripts\") pod \"placement-db-create-6nm7z\" (UID: \"e760b621-28bc-4ede-b08d-8e46250407eb\") " pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.242932 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxhfs\" (UniqueName: \"kubernetes.io/projected/e760b621-28bc-4ede-b08d-8e46250407eb-kube-api-access-jxhfs\") pod \"placement-db-create-6nm7z\" (UID: \"e760b621-28bc-4ede-b08d-8e46250407eb\") " pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.326474 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.326810 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995742f4-b26d-4f30-ae2c-2635257cd664-operator-scripts\") pod \"placement-43e3-account-create-update-hb8zx\" (UID: \"995742f4-b26d-4f30-ae2c-2635257cd664\") " pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.326856 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h9w5\" (UniqueName: \"kubernetes.io/projected/995742f4-b26d-4f30-ae2c-2635257cd664-kube-api-access-2h9w5\") pod \"placement-43e3-account-create-update-hb8zx\" (UID: \"995742f4-b26d-4f30-ae2c-2635257cd664\") " pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:53 crc kubenswrapper[4690]: E0320 17:51:53.327711 4690 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:51:53 crc kubenswrapper[4690]: E0320 17:51:53.327751 4690 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.327790 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995742f4-b26d-4f30-ae2c-2635257cd664-operator-scripts\") pod \"placement-43e3-account-create-update-hb8zx\" (UID: \"995742f4-b26d-4f30-ae2c-2635257cd664\") " pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:53 crc kubenswrapper[4690]: E0320 17:51:53.327817 4690 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift podName:4191d8a1-c023-4412-a90c-e819672da33f nodeName:}" failed. No retries permitted until 2026-03-20 17:52:01.327795923 +0000 UTC m=+1196.193621611 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift") pod "swift-storage-0" (UID: "4191d8a1-c023-4412-a90c-e819672da33f") : configmap "swift-ring-files" not found Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.346386 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h9w5\" (UniqueName: \"kubernetes.io/projected/995742f4-b26d-4f30-ae2c-2635257cd664-kube-api-access-2h9w5\") pod \"placement-43e3-account-create-update-hb8zx\" (UID: \"995742f4-b26d-4f30-ae2c-2635257cd664\") " pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.392250 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.405177 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.606981 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qch77"] Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.668036 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-98d6-account-create-update-hbppj"] Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.781107 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-98d6-account-create-update-hbppj" event={"ID":"d6415de2-c5b2-4077-91d9-74f1c0852b56","Type":"ContainerStarted","Data":"73c12fd928eb3aa741fc74b821edb45f9b0a07ee7c710f23c66789de83512586"} Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.787604 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qch77" event={"ID":"8762da29-e17c-42a8-b233-a6c565c3a6de","Type":"ContainerStarted","Data":"34ec22fce1b3d9e1f2e508649280a6972c91330959eacd0b67975326566e32f1"} Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.790840 4690 generic.go:334] "Generic (PLEG): container finished" podID="08b775c4-9217-4241-8049-1253db4ecb81" containerID="9e8043a54febb29b6ec235d3caddc995474dc5724fc958d914aa6e1fb3298c94" exitCode=0 Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.791008 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jdjw" event={"ID":"08b775c4-9217-4241-8049-1253db4ecb81","Type":"ContainerDied","Data":"9e8043a54febb29b6ec235d3caddc995474dc5724fc958d914aa6e1fb3298c94"} Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.791109 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jdjw" event={"ID":"08b775c4-9217-4241-8049-1253db4ecb81","Type":"ContainerStarted","Data":"a108dd31c9ffaa0103e27e2e524f47a6c3346f072f8913e4cef7282ed02bc380"} Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.800530 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"172668db-85fb-47e1-82fe-dee7c454993e","Type":"ContainerStarted","Data":"b2d813b3ff1bdcb33d47a300e01614a67408c9ca91207f2e17f2c8f9bb18655e"} Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.803816 4690 generic.go:334] "Generic (PLEG): container finished" podID="f8409e1a-31b0-4050-86bf-69c3d18c6185" containerID="1b3f404decb5c61241855588ab03641c91bda827e6d79fe66660b1469f93eaea" exitCode=0 Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.804088 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d6a5-account-create-update-kwm5l" event={"ID":"f8409e1a-31b0-4050-86bf-69c3d18c6185","Type":"ContainerDied","Data":"1b3f404decb5c61241855588ab03641c91bda827e6d79fe66660b1469f93eaea"} Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.804156 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d6a5-account-create-update-kwm5l" event={"ID":"f8409e1a-31b0-4050-86bf-69c3d18c6185","Type":"ContainerStarted","Data":"f807e34fdda2d70ded812d071a24dc4569a871ca8a8e993ddcd34d924ba45092"} Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.847582 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.8071238560000005 podStartE2EDuration="47.847562604s" podCreationTimestamp="2026-03-20 17:51:06 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.426788007 +0000 UTC m=+1146.292613685" lastFinishedPulling="2026-03-20 
17:51:53.467226765 +0000 UTC m=+1188.333052433" observedRunningTime="2026-03-20 17:51:53.840054119 +0000 UTC m=+1188.705879797" watchObservedRunningTime="2026-03-20 17:51:53.847562604 +0000 UTC m=+1188.713388292" Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.864481 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6nm7z"] Mar 20 17:51:53 crc kubenswrapper[4690]: I0320 17:51:53.915764 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-43e3-account-create-update-hb8zx"] Mar 20 17:51:53 crc kubenswrapper[4690]: W0320 17:51:53.919653 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod995742f4_b26d_4f30_ae2c_2635257cd664.slice/crio-8b3489d0d5dec7572713181bb4b31635a9ca3bac8537f16817c1e896299db394 WatchSource:0}: Error finding container 8b3489d0d5dec7572713181bb4b31635a9ca3bac8537f16817c1e896299db394: Status 404 returned error can't find the container with id 8b3489d0d5dec7572713181bb4b31635a9ca3bac8537f16817c1e896299db394 Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.152325 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.274049 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.274297 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.713576 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.796034 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9mzwk"] Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.796337 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" podUID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" containerName="dnsmasq-dns" containerID="cri-o://23a76f8b21ec13fc0517e5d372cbf075afe6057ed89575877f2580dc8cb31056" gracePeriod=10 Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.819799 4690 generic.go:334] "Generic (PLEG): container finished" podID="e760b621-28bc-4ede-b08d-8e46250407eb" containerID="708db4bb1e48b05476ad52b46d6ff75400022cc63ab48bf41ae24dd02a6cfd03" exitCode=0 Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.819860 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6nm7z" event={"ID":"e760b621-28bc-4ede-b08d-8e46250407eb","Type":"ContainerDied","Data":"708db4bb1e48b05476ad52b46d6ff75400022cc63ab48bf41ae24dd02a6cfd03"} Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.819887 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6nm7z" 
event={"ID":"e760b621-28bc-4ede-b08d-8e46250407eb","Type":"ContainerStarted","Data":"2819f26467458ec312f152d125b660700ed92dc59e99997653ef3d140662e711"} Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.824748 4690 generic.go:334] "Generic (PLEG): container finished" podID="d6415de2-c5b2-4077-91d9-74f1c0852b56" containerID="e7517286ef6ba806a0a448a5c4c3fb9ed64d8f3cc9d5aade27b840310610bc2e" exitCode=0 Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.825285 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-98d6-account-create-update-hbppj" event={"ID":"d6415de2-c5b2-4077-91d9-74f1c0852b56","Type":"ContainerDied","Data":"e7517286ef6ba806a0a448a5c4c3fb9ed64d8f3cc9d5aade27b840310610bc2e"} Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.828291 4690 generic.go:334] "Generic (PLEG): container finished" podID="8762da29-e17c-42a8-b233-a6c565c3a6de" containerID="a62e6985e8a728e6df4be921ff90b2ef220525cb6bedddd86024e5a64cceb273" exitCode=0 Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.828345 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qch77" event={"ID":"8762da29-e17c-42a8-b233-a6c565c3a6de","Type":"ContainerDied","Data":"a62e6985e8a728e6df4be921ff90b2ef220525cb6bedddd86024e5a64cceb273"} Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.830052 4690 generic.go:334] "Generic (PLEG): container finished" podID="995742f4-b26d-4f30-ae2c-2635257cd664" containerID="a5fed98b3c2fb3d0785b82bc8369314110a9df4d935ba8e7ec174b0b47a35990" exitCode=0 Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.830393 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-43e3-account-create-update-hb8zx" event={"ID":"995742f4-b26d-4f30-ae2c-2635257cd664","Type":"ContainerDied","Data":"a5fed98b3c2fb3d0785b82bc8369314110a9df4d935ba8e7ec174b0b47a35990"} Mar 20 17:51:54 crc kubenswrapper[4690]: I0320 17:51:54.830430 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-43e3-account-create-update-hb8zx" event={"ID":"995742f4-b26d-4f30-ae2c-2635257cd664","Type":"ContainerStarted","Data":"8b3489d0d5dec7572713181bb4b31635a9ca3bac8537f16817c1e896299db394"} Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.352609 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.358220 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.458520 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.474595 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lcbz\" (UniqueName: \"kubernetes.io/projected/08b775c4-9217-4241-8049-1253db4ecb81-kube-api-access-6lcbz\") pod \"08b775c4-9217-4241-8049-1253db4ecb81\" (UID: \"08b775c4-9217-4241-8049-1253db4ecb81\") " Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.474648 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b775c4-9217-4241-8049-1253db4ecb81-operator-scripts\") pod \"08b775c4-9217-4241-8049-1253db4ecb81\" (UID: \"08b775c4-9217-4241-8049-1253db4ecb81\") " Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.474700 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnxmx\" (UniqueName: \"kubernetes.io/projected/f8409e1a-31b0-4050-86bf-69c3d18c6185-kube-api-access-tnxmx\") pod \"f8409e1a-31b0-4050-86bf-69c3d18c6185\" (UID: \"f8409e1a-31b0-4050-86bf-69c3d18c6185\") " Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.474721 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8409e1a-31b0-4050-86bf-69c3d18c6185-operator-scripts\") pod \"f8409e1a-31b0-4050-86bf-69c3d18c6185\" (UID: \"f8409e1a-31b0-4050-86bf-69c3d18c6185\") " Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.475554 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08b775c4-9217-4241-8049-1253db4ecb81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08b775c4-9217-4241-8049-1253db4ecb81" (UID: "08b775c4-9217-4241-8049-1253db4ecb81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.475994 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8409e1a-31b0-4050-86bf-69c3d18c6185-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8409e1a-31b0-4050-86bf-69c3d18c6185" (UID: "f8409e1a-31b0-4050-86bf-69c3d18c6185"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.479696 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b775c4-9217-4241-8049-1253db4ecb81-kube-api-access-6lcbz" (OuterVolumeSpecName: "kube-api-access-6lcbz") pod "08b775c4-9217-4241-8049-1253db4ecb81" (UID: "08b775c4-9217-4241-8049-1253db4ecb81"). InnerVolumeSpecName "kube-api-access-6lcbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.479923 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8409e1a-31b0-4050-86bf-69c3d18c6185-kube-api-access-tnxmx" (OuterVolumeSpecName: "kube-api-access-tnxmx") pod "f8409e1a-31b0-4050-86bf-69c3d18c6185" (UID: "f8409e1a-31b0-4050-86bf-69c3d18c6185"). InnerVolumeSpecName "kube-api-access-tnxmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.577306 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lcbz\" (UniqueName: \"kubernetes.io/projected/08b775c4-9217-4241-8049-1253db4ecb81-kube-api-access-6lcbz\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.577339 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b775c4-9217-4241-8049-1253db4ecb81-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.577348 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnxmx\" (UniqueName: \"kubernetes.io/projected/f8409e1a-31b0-4050-86bf-69c3d18c6185-kube-api-access-tnxmx\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.577358 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8409e1a-31b0-4050-86bf-69c3d18c6185-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.842964 4690 generic.go:334] "Generic (PLEG): container finished" podID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" containerID="23a76f8b21ec13fc0517e5d372cbf075afe6057ed89575877f2580dc8cb31056" exitCode=0 Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.844099 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" event={"ID":"03da2fe9-5a86-4fba-87c7-7b2132c31d5f","Type":"ContainerDied","Data":"23a76f8b21ec13fc0517e5d372cbf075afe6057ed89575877f2580dc8cb31056"} Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.845080 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d6a5-account-create-update-kwm5l" event={"ID":"f8409e1a-31b0-4050-86bf-69c3d18c6185","Type":"ContainerDied","Data":"f807e34fdda2d70ded812d071a24dc4569a871ca8a8e993ddcd34d924ba45092"} Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.845113 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f807e34fdda2d70ded812d071a24dc4569a871ca8a8e993ddcd34d924ba45092" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.845127 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d6a5-account-create-update-kwm5l" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.847113 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9jdjw" Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.847264 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jdjw" event={"ID":"08b775c4-9217-4241-8049-1253db4ecb81","Type":"ContainerDied","Data":"a108dd31c9ffaa0103e27e2e524f47a6c3346f072f8913e4cef7282ed02bc380"} Mar 20 17:51:55 crc kubenswrapper[4690]: I0320 17:51:55.847549 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a108dd31c9ffaa0103e27e2e524f47a6c3346f072f8913e4cef7282ed02bc380" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.053088 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.229928 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.295866 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxhfs\" (UniqueName: \"kubernetes.io/projected/e760b621-28bc-4ede-b08d-8e46250407eb-kube-api-access-jxhfs\") pod \"e760b621-28bc-4ede-b08d-8e46250407eb\" (UID: \"e760b621-28bc-4ede-b08d-8e46250407eb\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.295995 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e760b621-28bc-4ede-b08d-8e46250407eb-operator-scripts\") pod \"e760b621-28bc-4ede-b08d-8e46250407eb\" (UID: \"e760b621-28bc-4ede-b08d-8e46250407eb\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.296897 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e760b621-28bc-4ede-b08d-8e46250407eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e760b621-28bc-4ede-b08d-8e46250407eb" (UID: "e760b621-28bc-4ede-b08d-8e46250407eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.302052 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e760b621-28bc-4ede-b08d-8e46250407eb-kube-api-access-jxhfs" (OuterVolumeSpecName: "kube-api-access-jxhfs") pod "e760b621-28bc-4ede-b08d-8e46250407eb" (UID: "e760b621-28bc-4ede-b08d-8e46250407eb"). InnerVolumeSpecName "kube-api-access-jxhfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.410912 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e760b621-28bc-4ede-b08d-8e46250407eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.410946 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxhfs\" (UniqueName: \"kubernetes.io/projected/e760b621-28bc-4ede-b08d-8e46250407eb-kube-api-access-jxhfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.441051 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.450187 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.493942 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qch77" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.500312 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.511653 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995742f4-b26d-4f30-ae2c-2635257cd664-operator-scripts\") pod \"995742f4-b26d-4f30-ae2c-2635257cd664\" (UID: \"995742f4-b26d-4f30-ae2c-2635257cd664\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.511709 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6415de2-c5b2-4077-91d9-74f1c0852b56-operator-scripts\") pod \"d6415de2-c5b2-4077-91d9-74f1c0852b56\" (UID: \"d6415de2-c5b2-4077-91d9-74f1c0852b56\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.511806 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpv6g\" (UniqueName: \"kubernetes.io/projected/d6415de2-c5b2-4077-91d9-74f1c0852b56-kube-api-access-dpv6g\") pod \"d6415de2-c5b2-4077-91d9-74f1c0852b56\" (UID: \"d6415de2-c5b2-4077-91d9-74f1c0852b56\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.512124 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6415de2-c5b2-4077-91d9-74f1c0852b56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6415de2-c5b2-4077-91d9-74f1c0852b56" (UID: "d6415de2-c5b2-4077-91d9-74f1c0852b56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.512168 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995742f4-b26d-4f30-ae2c-2635257cd664-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "995742f4-b26d-4f30-ae2c-2635257cd664" (UID: "995742f4-b26d-4f30-ae2c-2635257cd664"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.512322 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h9w5\" (UniqueName: \"kubernetes.io/projected/995742f4-b26d-4f30-ae2c-2635257cd664-kube-api-access-2h9w5\") pod \"995742f4-b26d-4f30-ae2c-2635257cd664\" (UID: \"995742f4-b26d-4f30-ae2c-2635257cd664\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.512801 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995742f4-b26d-4f30-ae2c-2635257cd664-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.512830 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6415de2-c5b2-4077-91d9-74f1c0852b56-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.515444 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6415de2-c5b2-4077-91d9-74f1c0852b56-kube-api-access-dpv6g" (OuterVolumeSpecName: "kube-api-access-dpv6g") pod "d6415de2-c5b2-4077-91d9-74f1c0852b56" (UID: "d6415de2-c5b2-4077-91d9-74f1c0852b56"). InnerVolumeSpecName "kube-api-access-dpv6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.515874 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995742f4-b26d-4f30-ae2c-2635257cd664-kube-api-access-2h9w5" (OuterVolumeSpecName: "kube-api-access-2h9w5") pod "995742f4-b26d-4f30-ae2c-2635257cd664" (UID: "995742f4-b26d-4f30-ae2c-2635257cd664"). InnerVolumeSpecName "kube-api-access-2h9w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.614007 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-config\") pod \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.614062 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djktk\" (UniqueName: \"kubernetes.io/projected/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-kube-api-access-djktk\") pod \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.614101 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-dns-svc\") pod \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\" (UID: \"03da2fe9-5a86-4fba-87c7-7b2132c31d5f\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.614231 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tfhn\" (UniqueName: \"kubernetes.io/projected/8762da29-e17c-42a8-b233-a6c565c3a6de-kube-api-access-7tfhn\") pod \"8762da29-e17c-42a8-b233-a6c565c3a6de\" (UID: \"8762da29-e17c-42a8-b233-a6c565c3a6de\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.614289 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8762da29-e17c-42a8-b233-a6c565c3a6de-operator-scripts\") pod \"8762da29-e17c-42a8-b233-a6c565c3a6de\" (UID: \"8762da29-e17c-42a8-b233-a6c565c3a6de\") " Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.614657 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpv6g\" (UniqueName: \"kubernetes.io/projected/d6415de2-c5b2-4077-91d9-74f1c0852b56-kube-api-access-dpv6g\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.614680 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h9w5\" (UniqueName: \"kubernetes.io/projected/995742f4-b26d-4f30-ae2c-2635257cd664-kube-api-access-2h9w5\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.615059 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8762da29-e17c-42a8-b233-a6c565c3a6de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8762da29-e17c-42a8-b233-a6c565c3a6de" (UID: "8762da29-e17c-42a8-b233-a6c565c3a6de"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.618334 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-kube-api-access-djktk" (OuterVolumeSpecName: "kube-api-access-djktk") pod "03da2fe9-5a86-4fba-87c7-7b2132c31d5f" (UID: "03da2fe9-5a86-4fba-87c7-7b2132c31d5f"). InnerVolumeSpecName "kube-api-access-djktk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.619591 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8762da29-e17c-42a8-b233-a6c565c3a6de-kube-api-access-7tfhn" (OuterVolumeSpecName: "kube-api-access-7tfhn") pod "8762da29-e17c-42a8-b233-a6c565c3a6de" (UID: "8762da29-e17c-42a8-b233-a6c565c3a6de"). InnerVolumeSpecName "kube-api-access-7tfhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.648438 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-config" (OuterVolumeSpecName: "config") pod "03da2fe9-5a86-4fba-87c7-7b2132c31d5f" (UID: "03da2fe9-5a86-4fba-87c7-7b2132c31d5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.652645 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03da2fe9-5a86-4fba-87c7-7b2132c31d5f" (UID: "03da2fe9-5a86-4fba-87c7-7b2132c31d5f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.716588 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.716622 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djktk\" (UniqueName: \"kubernetes.io/projected/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-kube-api-access-djktk\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.716632 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03da2fe9-5a86-4fba-87c7-7b2132c31d5f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.716643 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tfhn\" (UniqueName: \"kubernetes.io/projected/8762da29-e17c-42a8-b233-a6c565c3a6de-kube-api-access-7tfhn\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.716653 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8762da29-e17c-42a8-b233-a6c565c3a6de-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.855783 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-98d6-account-create-update-hbppj" event={"ID":"d6415de2-c5b2-4077-91d9-74f1c0852b56","Type":"ContainerDied","Data":"73c12fd928eb3aa741fc74b821edb45f9b0a07ee7c710f23c66789de83512586"} Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.855845 
4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73c12fd928eb3aa741fc74b821edb45f9b0a07ee7c710f23c66789de83512586" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.855806 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-98d6-account-create-update-hbppj" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.858114 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qch77" event={"ID":"8762da29-e17c-42a8-b233-a6c565c3a6de","Type":"ContainerDied","Data":"34ec22fce1b3d9e1f2e508649280a6972c91330959eacd0b67975326566e32f1"} Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.858161 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34ec22fce1b3d9e1f2e508649280a6972c91330959eacd0b67975326566e32f1" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.858218 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qch77" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.860543 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-43e3-account-create-update-hb8zx" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.862397 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-43e3-account-create-update-hb8zx" event={"ID":"995742f4-b26d-4f30-ae2c-2635257cd664","Type":"ContainerDied","Data":"8b3489d0d5dec7572713181bb4b31635a9ca3bac8537f16817c1e896299db394"} Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.862636 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b3489d0d5dec7572713181bb4b31635a9ca3bac8537f16817c1e896299db394" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.864972 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.864977 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9mzwk" event={"ID":"03da2fe9-5a86-4fba-87c7-7b2132c31d5f","Type":"ContainerDied","Data":"f0684160015a6d523d6d312a456c6868dff5d334dd83e652d5a4d44238fbd79a"} Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.865110 4690 scope.go:117] "RemoveContainer" containerID="23a76f8b21ec13fc0517e5d372cbf075afe6057ed89575877f2580dc8cb31056" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.868895 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6nm7z" event={"ID":"e760b621-28bc-4ede-b08d-8e46250407eb","Type":"ContainerDied","Data":"2819f26467458ec312f152d125b660700ed92dc59e99997653ef3d140662e711"} Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.868938 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2819f26467458ec312f152d125b660700ed92dc59e99997653ef3d140662e711" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.869002 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6nm7z" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.901503 4690 scope.go:117] "RemoveContainer" containerID="6b008c90252cedb91c5007188f4d976d4aac86e642c8f4a5ee05eadc58541fb4" Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.935192 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9mzwk"] Mar 20 17:51:56 crc kubenswrapper[4690]: I0320 17:51:56.942412 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9mzwk"] Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.189356 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-x2bws"] Mar 20 17:51:57 crc kubenswrapper[4690]: E0320 17:51:57.190891 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e760b621-28bc-4ede-b08d-8e46250407eb" containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.191095 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e760b621-28bc-4ede-b08d-8e46250407eb" containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: E0320 17:51:57.191284 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" containerName="dnsmasq-dns" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.191419 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" containerName="dnsmasq-dns" Mar 20 17:51:57 crc kubenswrapper[4690]: E0320 17:51:57.191558 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" containerName="init" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.191673 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" containerName="init" Mar 20 17:51:57 crc kubenswrapper[4690]: E0320 17:51:57.191971 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6415de2-c5b2-4077-91d9-74f1c0852b56" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.192103 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6415de2-c5b2-4077-91d9-74f1c0852b56" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: E0320 17:51:57.192239 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8409e1a-31b0-4050-86bf-69c3d18c6185" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.192400 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8409e1a-31b0-4050-86bf-69c3d18c6185" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: E0320 17:51:57.192558 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995742f4-b26d-4f30-ae2c-2635257cd664" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.192719 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="995742f4-b26d-4f30-ae2c-2635257cd664" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: E0320 17:51:57.192892 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8762da29-e17c-42a8-b233-a6c565c3a6de" containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.193023 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8762da29-e17c-42a8-b233-a6c565c3a6de" 
containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: E0320 17:51:57.193195 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b775c4-9217-4241-8049-1253db4ecb81" containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.193375 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b775c4-9217-4241-8049-1253db4ecb81" containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.193846 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" containerName="dnsmasq-dns" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.194011 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8762da29-e17c-42a8-b233-a6c565c3a6de" containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.194168 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="995742f4-b26d-4f30-ae2c-2635257cd664" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.194333 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8409e1a-31b0-4050-86bf-69c3d18c6185" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.194474 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6415de2-c5b2-4077-91d9-74f1c0852b56" containerName="mariadb-account-create-update" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.194624 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b775c4-9217-4241-8049-1253db4ecb81" containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.194768 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e760b621-28bc-4ede-b08d-8e46250407eb" containerName="mariadb-database-create" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.195782 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.199676 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-x6sz7" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.210246 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.210307 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-x2bws"] Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.227920 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-config-data\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.228472 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-combined-ca-bundle\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.228553 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd7xx\" (UniqueName: \"kubernetes.io/projected/d08ec433-4043-43b3-ae56-de712919babe-kube-api-access-fd7xx\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.228622 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-db-sync-config-data\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.329531 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-combined-ca-bundle\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.329578 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd7xx\" (UniqueName: \"kubernetes.io/projected/d08ec433-4043-43b3-ae56-de712919babe-kube-api-access-fd7xx\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.329612 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-db-sync-config-data\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.329715 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-config-data\") pod 
\"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.334966 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-config-data\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.335666 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-db-sync-config-data\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.336091 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-combined-ca-bundle\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.351520 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd7xx\" (UniqueName: \"kubernetes.io/projected/d08ec433-4043-43b3-ae56-de712919babe-kube-api-access-fd7xx\") pod \"glance-db-sync-x2bws\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.458636 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.519344 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x2bws" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.892937 4690 generic.go:334] "Generic (PLEG): container finished" podID="7930c325-4b03-450e-b3d0-b7116efc71cb" containerID="3b92eaf68e23ee0ed730d3fc39929d439f967abd6608547f33f138444f77c6db" exitCode=0 Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.896814 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03da2fe9-5a86-4fba-87c7-7b2132c31d5f" path="/var/lib/kubelet/pods/03da2fe9-5a86-4fba-87c7-7b2132c31d5f/volumes" Mar 20 17:51:57 crc kubenswrapper[4690]: I0320 17:51:57.897524 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dsjgc" event={"ID":"7930c325-4b03-450e-b3d0-b7116efc71cb","Type":"ContainerDied","Data":"3b92eaf68e23ee0ed730d3fc39929d439f967abd6608547f33f138444f77c6db"} Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.042497 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-x2bws"] Mar 20 17:51:58 crc kubenswrapper[4690]: W0320 17:51:58.045616 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd08ec433_4043_43b3_ae56_de712919babe.slice/crio-4bbf117be75e138da0805afdf4c344e1c36e887e44aab7bb40732373ab0c8635 WatchSource:0}: Error finding container 4bbf117be75e138da0805afdf4c344e1c36e887e44aab7bb40732373ab0c8635: Status 404 returned error can't find the container with id 4bbf117be75e138da0805afdf4c344e1c36e887e44aab7bb40732373ab0c8635 Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.526493 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.590370 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.757032 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.758803 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.763229 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-vkwgn" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.763457 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.763572 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.763689 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.768974 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-sxwxr"] Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.769958 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sxwxr" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.774288 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.785126 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.804027 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sxwxr"] Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.870865 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fac0864-e5f7-4226-a62d-0e9036b6c420-operator-scripts\") pod \"root-account-create-update-sxwxr\" (UID: \"9fac0864-e5f7-4226-a62d-0e9036b6c420\") " pod="openstack/root-account-create-update-sxwxr" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.870943 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a18387d-9d4e-4fd5-bdb3-8568831a7930-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.870970 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a18387d-9d4e-4fd5-bdb3-8568831a7930-scripts\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.871044 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.871122 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pgsq\" (UniqueName: \"kubernetes.io/projected/6a18387d-9d4e-4fd5-bdb3-8568831a7930-kube-api-access-2pgsq\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.871204 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ss7r\" (UniqueName: \"kubernetes.io/projected/9fac0864-e5f7-4226-a62d-0e9036b6c420-kube-api-access-9ss7r\") pod \"root-account-create-update-sxwxr\" (UID: \"9fac0864-e5f7-4226-a62d-0e9036b6c420\") " pod="openstack/root-account-create-update-sxwxr" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.871226 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.871246 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.871303 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a18387d-9d4e-4fd5-bdb3-8568831a7930-config\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.899788 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x2bws" event={"ID":"d08ec433-4043-43b3-ae56-de712919babe","Type":"ContainerStarted","Data":"4bbf117be75e138da0805afdf4c344e1c36e887e44aab7bb40732373ab0c8635"} Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.972615 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pgsq\" (UniqueName: \"kubernetes.io/projected/6a18387d-9d4e-4fd5-bdb3-8568831a7930-kube-api-access-2pgsq\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.972716 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ss7r\" (UniqueName: \"kubernetes.io/projected/9fac0864-e5f7-4226-a62d-0e9036b6c420-kube-api-access-9ss7r\") pod \"root-account-create-update-sxwxr\" (UID: \"9fac0864-e5f7-4226-a62d-0e9036b6c420\") " pod="openstack/root-account-create-update-sxwxr" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.972741 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.972767 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.972805 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a18387d-9d4e-4fd5-bdb3-8568831a7930-config\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.972860 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fac0864-e5f7-4226-a62d-0e9036b6c420-operator-scripts\") pod \"root-account-create-update-sxwxr\" (UID: \"9fac0864-e5f7-4226-a62d-0e9036b6c420\") " pod="openstack/root-account-create-update-sxwxr" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.972988 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a18387d-9d4e-4fd5-bdb3-8568831a7930-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.973014 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a18387d-9d4e-4fd5-bdb3-8568831a7930-scripts\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.973156 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.975597 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a18387d-9d4e-4fd5-bdb3-8568831a7930-config\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.976343 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fac0864-e5f7-4226-a62d-0e9036b6c420-operator-scripts\") pod \"root-account-create-update-sxwxr\" (UID: \"9fac0864-e5f7-4226-a62d-0e9036b6c420\") " pod="openstack/root-account-create-update-sxwxr" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.976702 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a18387d-9d4e-4fd5-bdb3-8568831a7930-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.980041 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.981809 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a18387d-9d4e-4fd5-bdb3-8568831a7930-scripts\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:58 crc kubenswrapper[4690]: I0320 17:51:58.982872 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:58.999571 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pgsq\" (UniqueName: \"kubernetes.io/projected/6a18387d-9d4e-4fd5-bdb3-8568831a7930-kube-api-access-2pgsq\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.000100 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a18387d-9d4e-4fd5-bdb3-8568831a7930-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a18387d-9d4e-4fd5-bdb3-8568831a7930\") " pod="openstack/ovn-northd-0" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.002193 4690 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ss7r\" (UniqueName: \"kubernetes.io/projected/9fac0864-e5f7-4226-a62d-0e9036b6c420-kube-api-access-9ss7r\") pod \"root-account-create-update-sxwxr\" (UID: \"9fac0864-e5f7-4226-a62d-0e9036b6c420\") " pod="openstack/root-account-create-update-sxwxr" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.076324 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.095694 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sxwxr" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.227152 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.277126 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7930c325-4b03-450e-b3d0-b7116efc71cb-etc-swift\") pod \"7930c325-4b03-450e-b3d0-b7116efc71cb\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.277161 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-swiftconf\") pod \"7930c325-4b03-450e-b3d0-b7116efc71cb\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.277179 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-combined-ca-bundle\") pod \"7930c325-4b03-450e-b3d0-b7116efc71cb\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.277229 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-dispersionconf\") pod \"7930c325-4b03-450e-b3d0-b7116efc71cb\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.277660 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4wwn\" (UniqueName: \"kubernetes.io/projected/7930c325-4b03-450e-b3d0-b7116efc71cb-kube-api-access-x4wwn\") pod \"7930c325-4b03-450e-b3d0-b7116efc71cb\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.277689 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-scripts\") pod \"7930c325-4b03-450e-b3d0-b7116efc71cb\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.277712 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-ring-data-devices\") pod \"7930c325-4b03-450e-b3d0-b7116efc71cb\" (UID: \"7930c325-4b03-450e-b3d0-b7116efc71cb\") " Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.279180 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7930c325-4b03-450e-b3d0-b7116efc71cb" (UID: "7930c325-4b03-450e-b3d0-b7116efc71cb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.280779 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7930c325-4b03-450e-b3d0-b7116efc71cb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7930c325-4b03-450e-b3d0-b7116efc71cb" (UID: "7930c325-4b03-450e-b3d0-b7116efc71cb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.283643 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7930c325-4b03-450e-b3d0-b7116efc71cb-kube-api-access-x4wwn" (OuterVolumeSpecName: "kube-api-access-x4wwn") pod "7930c325-4b03-450e-b3d0-b7116efc71cb" (UID: "7930c325-4b03-450e-b3d0-b7116efc71cb"). InnerVolumeSpecName "kube-api-access-x4wwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.287754 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7930c325-4b03-450e-b3d0-b7116efc71cb" (UID: "7930c325-4b03-450e-b3d0-b7116efc71cb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.302193 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-scripts" (OuterVolumeSpecName: "scripts") pod "7930c325-4b03-450e-b3d0-b7116efc71cb" (UID: "7930c325-4b03-450e-b3d0-b7116efc71cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.305021 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7930c325-4b03-450e-b3d0-b7116efc71cb" (UID: "7930c325-4b03-450e-b3d0-b7116efc71cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.315404 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7930c325-4b03-450e-b3d0-b7116efc71cb" (UID: "7930c325-4b03-450e-b3d0-b7116efc71cb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.380006 4690 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7930c325-4b03-450e-b3d0-b7116efc71cb-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.380034 4690 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.380043 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.380052 4690 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7930c325-4b03-450e-b3d0-b7116efc71cb-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.380060 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4wwn\" (UniqueName: \"kubernetes.io/projected/7930c325-4b03-450e-b3d0-b7116efc71cb-kube-api-access-x4wwn\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.380071 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.380078 4690 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7930c325-4b03-450e-b3d0-b7116efc71cb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.551428 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:51:59 crc kubenswrapper[4690]: W0320 17:51:59.554567 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a18387d_9d4e_4fd5_bdb3_8568831a7930.slice/crio-cfe48b7727ba4578fbfa6ffdd76eb92b6ab141733a4e95e5cda6715048cfa06a WatchSource:0}: Error finding container cfe48b7727ba4578fbfa6ffdd76eb92b6ab141733a4e95e5cda6715048cfa06a: Status 404 returned error can't find the container with id cfe48b7727ba4578fbfa6ffdd76eb92b6ab141733a4e95e5cda6715048cfa06a Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.597855 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sxwxr"] Mar 20 17:51:59 crc kubenswrapper[4690]: W0320 17:51:59.599438 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fac0864_e5f7_4226_a62d_0e9036b6c420.slice/crio-3dae4735a5f7465112264efdb33d7abbe00a8e7efc1b96421a4b62fe5a8ee92f WatchSource:0}: Error finding container 3dae4735a5f7465112264efdb33d7abbe00a8e7efc1b96421a4b62fe5a8ee92f: Status 404 returned error can't find the container with id 3dae4735a5f7465112264efdb33d7abbe00a8e7efc1b96421a4b62fe5a8ee92f Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.911051 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sxwxr" 
event={"ID":"9fac0864-e5f7-4226-a62d-0e9036b6c420","Type":"ContainerStarted","Data":"54a42d890ed25cd146ab7ae3de8ae0395d9660ddeb8c74ed0a2cbe365d3c3ffa"} Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.911431 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sxwxr" event={"ID":"9fac0864-e5f7-4226-a62d-0e9036b6c420","Type":"ContainerStarted","Data":"3dae4735a5f7465112264efdb33d7abbe00a8e7efc1b96421a4b62fe5a8ee92f"} Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.913440 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dsjgc" event={"ID":"7930c325-4b03-450e-b3d0-b7116efc71cb","Type":"ContainerDied","Data":"774caecb580b5883e271f086a0255ab1c88d78dfa64b1557e7ffdf4bcaf5c540"} Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.913476 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="774caecb580b5883e271f086a0255ab1c88d78dfa64b1557e7ffdf4bcaf5c540" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.913543 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dsjgc" Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.923002 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6a18387d-9d4e-4fd5-bdb3-8568831a7930","Type":"ContainerStarted","Data":"cfe48b7727ba4578fbfa6ffdd76eb92b6ab141733a4e95e5cda6715048cfa06a"} Mar 20 17:51:59 crc kubenswrapper[4690]: I0320 17:51:59.930672 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-sxwxr" podStartSLOduration=1.930650631 podStartE2EDuration="1.930650631s" podCreationTimestamp="2026-03-20 17:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:51:59.929679113 +0000 UTC m=+1194.795504831" watchObservedRunningTime="2026-03-20 17:51:59.930650631 +0000 UTC m=+1194.796476309" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.129313 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567152-5hrb6"] Mar 20 17:52:00 crc kubenswrapper[4690]: E0320 17:52:00.129733 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7930c325-4b03-450e-b3d0-b7116efc71cb" containerName="swift-ring-rebalance" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.131142 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7930c325-4b03-450e-b3d0-b7116efc71cb" containerName="swift-ring-rebalance" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.131457 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="7930c325-4b03-450e-b3d0-b7116efc71cb" containerName="swift-ring-rebalance" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.132111 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-5hrb6" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.134912 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.135076 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.135188 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.137489 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-5hrb6"] Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.193870 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shncx\" (UniqueName: \"kubernetes.io/projected/f3572a99-ffc5-435a-b485-aa6aa5c9479c-kube-api-access-shncx\") pod \"auto-csr-approver-29567152-5hrb6\" (UID: \"f3572a99-ffc5-435a-b485-aa6aa5c9479c\") " pod="openshift-infra/auto-csr-approver-29567152-5hrb6" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.296156 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shncx\" (UniqueName: \"kubernetes.io/projected/f3572a99-ffc5-435a-b485-aa6aa5c9479c-kube-api-access-shncx\") pod \"auto-csr-approver-29567152-5hrb6\" (UID: \"f3572a99-ffc5-435a-b485-aa6aa5c9479c\") " pod="openshift-infra/auto-csr-approver-29567152-5hrb6" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.319793 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shncx\" (UniqueName: \"kubernetes.io/projected/f3572a99-ffc5-435a-b485-aa6aa5c9479c-kube-api-access-shncx\") pod \"auto-csr-approver-29567152-5hrb6\" (UID: \"f3572a99-ffc5-435a-b485-aa6aa5c9479c\") " pod="openshift-infra/auto-csr-approver-29567152-5hrb6" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.447626 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-5hrb6" Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.935825 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-5hrb6"] Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.948433 4690 generic.go:334] "Generic (PLEG): container finished" podID="9fac0864-e5f7-4226-a62d-0e9036b6c420" containerID="54a42d890ed25cd146ab7ae3de8ae0395d9660ddeb8c74ed0a2cbe365d3c3ffa" exitCode=0 Mar 20 17:52:00 crc kubenswrapper[4690]: I0320 17:52:00.948481 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sxwxr" event={"ID":"9fac0864-e5f7-4226-a62d-0e9036b6c420","Type":"ContainerDied","Data":"54a42d890ed25cd146ab7ae3de8ae0395d9660ddeb8c74ed0a2cbe365d3c3ffa"} Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.336635 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.341292 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4191d8a1-c023-4412-a90c-e819672da33f-etc-swift\") pod \"swift-storage-0\" (UID: \"4191d8a1-c023-4412-a90c-e819672da33f\") " pod="openstack/swift-storage-0" Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.353073 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.858518 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:52:01 crc kubenswrapper[4690]: W0320 17:52:01.862273 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4191d8a1_c023_4412_a90c_e819672da33f.slice/crio-38899fd966a9dacdd0561a40c8d007d06a42357edd68ca7cc9d37ddd0a0ef65b WatchSource:0}: Error finding container 38899fd966a9dacdd0561a40c8d007d06a42357edd68ca7cc9d37ddd0a0ef65b: Status 404 returned error can't find the container with id 38899fd966a9dacdd0561a40c8d007d06a42357edd68ca7cc9d37ddd0a0ef65b Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.961752 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6a18387d-9d4e-4fd5-bdb3-8568831a7930","Type":"ContainerStarted","Data":"892a45ceab066299fe2242e4ea262801339c98be31cdfe943ab0400318980b12"} Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.961822 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6a18387d-9d4e-4fd5-bdb3-8568831a7930","Type":"ContainerStarted","Data":"c18942a9d2fa80936ca7c151f5ff016250b06a867c2a00a1be677137de0ad3cb"} Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.961894 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.964107 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"38899fd966a9dacdd0561a40c8d007d06a42357edd68ca7cc9d37ddd0a0ef65b"} Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.965092 4690 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29567152-5hrb6" event={"ID":"f3572a99-ffc5-435a-b485-aa6aa5c9479c","Type":"ContainerStarted","Data":"9cfdbc477ee4a9d9c0951e10ee95627bf234f1b6e53ac84ab79684f997aa3ae7"} Mar 20 17:52:01 crc kubenswrapper[4690]: I0320 17:52:01.984783 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.370518894 podStartE2EDuration="3.984766989s" podCreationTimestamp="2026-03-20 17:51:58 +0000 UTC" firstStartedPulling="2026-03-20 17:51:59.558147846 +0000 UTC m=+1194.423973524" lastFinishedPulling="2026-03-20 17:52:01.172395941 +0000 UTC m=+1196.038221619" observedRunningTime="2026-03-20 17:52:01.981767403 +0000 UTC m=+1196.847593131" watchObservedRunningTime="2026-03-20 17:52:01.984766989 +0000 UTC m=+1196.850592667" Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.299147 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sxwxr" Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.365527 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fac0864-e5f7-4226-a62d-0e9036b6c420-operator-scripts\") pod \"9fac0864-e5f7-4226-a62d-0e9036b6c420\" (UID: \"9fac0864-e5f7-4226-a62d-0e9036b6c420\") " Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.365752 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ss7r\" (UniqueName: \"kubernetes.io/projected/9fac0864-e5f7-4226-a62d-0e9036b6c420-kube-api-access-9ss7r\") pod \"9fac0864-e5f7-4226-a62d-0e9036b6c420\" (UID: \"9fac0864-e5f7-4226-a62d-0e9036b6c420\") " Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.366398 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fac0864-e5f7-4226-a62d-0e9036b6c420-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9fac0864-e5f7-4226-a62d-0e9036b6c420" (UID: "9fac0864-e5f7-4226-a62d-0e9036b6c420"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.372089 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fac0864-e5f7-4226-a62d-0e9036b6c420-kube-api-access-9ss7r" (OuterVolumeSpecName: "kube-api-access-9ss7r") pod "9fac0864-e5f7-4226-a62d-0e9036b6c420" (UID: "9fac0864-e5f7-4226-a62d-0e9036b6c420"). InnerVolumeSpecName "kube-api-access-9ss7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.467524 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fac0864-e5f7-4226-a62d-0e9036b6c420-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.467562 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ss7r\" (UniqueName: \"kubernetes.io/projected/9fac0864-e5f7-4226-a62d-0e9036b6c420-kube-api-access-9ss7r\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.976664 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sxwxr" event={"ID":"9fac0864-e5f7-4226-a62d-0e9036b6c420","Type":"ContainerDied","Data":"3dae4735a5f7465112264efdb33d7abbe00a8e7efc1b96421a4b62fe5a8ee92f"} Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.976712 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dae4735a5f7465112264efdb33d7abbe00a8e7efc1b96421a4b62fe5a8ee92f" Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.976791 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sxwxr" Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.979673 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"a2d92a33900d4c07c58b5a72b5d9b32d2a24e745a41b4ab9bbde1fb1b0cd3861"} Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.981508 4690 generic.go:334] "Generic (PLEG): container finished" podID="f3572a99-ffc5-435a-b485-aa6aa5c9479c" containerID="c48916f05045307b3413cb61f6721d52155a11bc486656d060bc10c70beba6d8" exitCode=0 Mar 20 17:52:02 crc kubenswrapper[4690]: I0320 17:52:02.981545 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-5hrb6" event={"ID":"f3572a99-ffc5-435a-b485-aa6aa5c9479c","Type":"ContainerDied","Data":"c48916f05045307b3413cb61f6721d52155a11bc486656d060bc10c70beba6d8"} Mar 20 17:52:03 crc kubenswrapper[4690]: I0320 17:52:03.415879 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:52:03 crc kubenswrapper[4690]: I0320 17:52:03.995467 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"a03c12cc1e995e2cf6fd7aec6c11d623ed707df6b5f2ff7f4413857927f9dcfb"} Mar 20 17:52:03 crc kubenswrapper[4690]: I0320 17:52:03.995507 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"68f7cb89953cfed08c24ba1c01418c90fa9216270f99afe1134327147d336c8a"} Mar 20 17:52:03 crc kubenswrapper[4690]: I0320 17:52:03.995519 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"9b3fff75a525e5d1347438b5c335b6ea00beaa3654acbea57c62ca0808a567aa"} Mar 20 17:52:05 crc kubenswrapper[4690]: I0320 17:52:05.169540 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-sxwxr"] Mar 20 17:52:05 crc kubenswrapper[4690]: I0320 17:52:05.176727 4690 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/root-account-create-update-sxwxr"] Mar 20 17:52:05 crc kubenswrapper[4690]: I0320 17:52:05.898750 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fac0864-e5f7-4226-a62d-0e9036b6c420" path="/var/lib/kubelet/pods/9fac0864-e5f7-4226-a62d-0e9036b6c420/volumes" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.517726 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.520460 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8dmtk" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.747075 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j8pr4-config-dng29"] Mar 20 17:52:07 crc kubenswrapper[4690]: E0320 17:52:07.747851 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fac0864-e5f7-4226-a62d-0e9036b6c420" containerName="mariadb-account-create-update" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.747877 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fac0864-e5f7-4226-a62d-0e9036b6c420" containerName="mariadb-account-create-update" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.748047 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fac0864-e5f7-4226-a62d-0e9036b6c420" containerName="mariadb-account-create-update" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.748746 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.751959 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.756679 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j8pr4-config-dng29"] Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.851435 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqscv\" (UniqueName: \"kubernetes.io/projected/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-kube-api-access-dqscv\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.851540 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-scripts\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.851573 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run-ovn\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.851604 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run\") pod 
\"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.851703 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-log-ovn\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.851734 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-additional-scripts\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.954085 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-additional-scripts\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.954175 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqscv\" (UniqueName: \"kubernetes.io/projected/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-kube-api-access-dqscv\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.954270 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-scripts\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.954295 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run-ovn\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.954316 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.954398 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-log-ovn\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.954703 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-log-ovn\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.956026 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-additional-scripts\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.956536 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run-ovn\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.956735 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.964122 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-scripts\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:07 crc kubenswrapper[4690]: I0320 17:52:07.978190 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqscv\" (UniqueName: \"kubernetes.io/projected/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-kube-api-access-dqscv\") pod \"ovn-controller-j8pr4-config-dng29\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:08 crc kubenswrapper[4690]: I0320 17:52:08.072310 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.046158 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-5hrb6" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.057889 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-5hrb6" event={"ID":"f3572a99-ffc5-435a-b485-aa6aa5c9479c","Type":"ContainerDied","Data":"9cfdbc477ee4a9d9c0951e10ee95627bf234f1b6e53ac84ab79684f997aa3ae7"} Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.057939 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cfdbc477ee4a9d9c0951e10ee95627bf234f1b6e53ac84ab79684f997aa3ae7" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.057950 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-5hrb6" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.093143 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shncx\" (UniqueName: \"kubernetes.io/projected/f3572a99-ffc5-435a-b485-aa6aa5c9479c-kube-api-access-shncx\") pod \"f3572a99-ffc5-435a-b485-aa6aa5c9479c\" (UID: \"f3572a99-ffc5-435a-b485-aa6aa5c9479c\") " Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.110135 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3572a99-ffc5-435a-b485-aa6aa5c9479c-kube-api-access-shncx" (OuterVolumeSpecName: "kube-api-access-shncx") pod "f3572a99-ffc5-435a-b485-aa6aa5c9479c" (UID: "f3572a99-ffc5-435a-b485-aa6aa5c9479c"). InnerVolumeSpecName "kube-api-access-shncx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.180429 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-khb42"] Mar 20 17:52:10 crc kubenswrapper[4690]: E0320 17:52:10.180900 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3572a99-ffc5-435a-b485-aa6aa5c9479c" containerName="oc" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.180922 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3572a99-ffc5-435a-b485-aa6aa5c9479c" containerName="oc" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.181143 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3572a99-ffc5-435a-b485-aa6aa5c9479c" containerName="oc" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.181899 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-khb42" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.184502 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.188599 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-khb42"] Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.196913 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shncx\" (UniqueName: \"kubernetes.io/projected/f3572a99-ffc5-435a-b485-aa6aa5c9479c-kube-api-access-shncx\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.299466 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3dc2e5-e309-432f-a876-8cf78434e9d7-operator-scripts\") pod \"root-account-create-update-khb42\" (UID: \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\") " pod="openstack/root-account-create-update-khb42" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.299550 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvlvz\" (UniqueName: \"kubernetes.io/projected/fd3dc2e5-e309-432f-a876-8cf78434e9d7-kube-api-access-kvlvz\") pod \"root-account-create-update-khb42\" (UID: \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\") " pod="openstack/root-account-create-update-khb42" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.401003 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fd3dc2e5-e309-432f-a876-8cf78434e9d7-operator-scripts\") pod \"root-account-create-update-khb42\" (UID: \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\") " pod="openstack/root-account-create-update-khb42" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.401108 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvlvz\" (UniqueName: \"kubernetes.io/projected/fd3dc2e5-e309-432f-a876-8cf78434e9d7-kube-api-access-kvlvz\") pod \"root-account-create-update-khb42\" (UID: \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\") " pod="openstack/root-account-create-update-khb42" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.402847 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3dc2e5-e309-432f-a876-8cf78434e9d7-operator-scripts\") pod \"root-account-create-update-khb42\" (UID: \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\") " pod="openstack/root-account-create-update-khb42" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.417623 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvlvz\" (UniqueName: \"kubernetes.io/projected/fd3dc2e5-e309-432f-a876-8cf78434e9d7-kube-api-access-kvlvz\") pod \"root-account-create-update-khb42\" (UID: \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\") " pod="openstack/root-account-create-update-khb42" Mar 20 17:52:10 crc kubenswrapper[4690]: I0320 17:52:10.508574 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-khb42" Mar 20 17:52:11 crc kubenswrapper[4690]: I0320 17:52:11.115689 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-qqzwn"] Mar 20 17:52:11 crc kubenswrapper[4690]: I0320 17:52:11.124089 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-qqzwn"] Mar 20 17:52:11 crc kubenswrapper[4690]: I0320 17:52:11.899678 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6397759-9cf9-4996-9fc7-6ec98f00014a" path="/var/lib/kubelet/pods/a6397759-9cf9-4996-9fc7-6ec98f00014a/volumes" Mar 20 17:52:12 crc kubenswrapper[4690]: I0320 17:52:12.750248 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j8pr4-config-dng29"] Mar 20 17:52:12 crc kubenswrapper[4690]: W0320 17:52:12.763092 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b2cf7f9_a868_4fd3_9065_fbebe6c6d6ae.slice/crio-55d02649d2f962c7b1c9c628217844a045354498ec2636bb1f1fb6f5c588501e WatchSource:0}: Error finding container 55d02649d2f962c7b1c9c628217844a045354498ec2636bb1f1fb6f5c588501e: Status 404 returned error can't find the container with id 55d02649d2f962c7b1c9c628217844a045354498ec2636bb1f1fb6f5c588501e Mar 20 17:52:12 crc kubenswrapper[4690]: I0320 17:52:12.870879 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-khb42"] Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.088621 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"98af18b95254bbd45be568141ca9b5fc3c9e960d76434ca641c3efd31b2c146c"} Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.089006 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"84ff39cf29a167aed6f9506421b7874b406bd8cf4ee912c2b8bf484522339f2e"} Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.089024 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"8ec6a797ca9fbe0f188493b67907d168c3c0a57e4007b965a673479c2e167db4"} Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.090474 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x2bws" event={"ID":"d08ec433-4043-43b3-ae56-de712919babe","Type":"ContainerStarted","Data":"d0564bea1fff016e08463a9c5e3dcfd1bde664bd8343722644716c94489ee616"} Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.091604 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j8pr4-config-dng29" event={"ID":"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae","Type":"ContainerStarted","Data":"55d02649d2f962c7b1c9c628217844a045354498ec2636bb1f1fb6f5c588501e"} Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.092973 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-khb42" event={"ID":"fd3dc2e5-e309-432f-a876-8cf78434e9d7","Type":"ContainerStarted","Data":"ebb24362457997a0b0694fab60326ed9c7225d8706dabcd9a2cd29616831dcc5"} Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.093004 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-khb42" event={"ID":"fd3dc2e5-e309-432f-a876-8cf78434e9d7","Type":"ContainerStarted","Data":"9f794d3d1112c616a7f5c3fb5e189e8284c457c9d61817439fb66361493df0d9"} Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.112684 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-x2bws" podStartSLOduration=1.828074874 podStartE2EDuration="16.112666026s" podCreationTimestamp="2026-03-20 17:51:57 +0000 UTC" firstStartedPulling="2026-03-20 17:51:58.047841126 +0000 UTC m=+1192.913666824" lastFinishedPulling="2026-03-20 17:52:12.332432278 +0000 UTC m=+1207.198257976" observedRunningTime="2026-03-20 17:52:13.108526518 +0000 UTC m=+1207.974352196" watchObservedRunningTime="2026-03-20 17:52:13.112666026 +0000 UTC m=+1207.978491704" Mar 20 17:52:13 crc kubenswrapper[4690]: I0320 17:52:13.134506 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-khb42" podStartSLOduration=3.134491021 podStartE2EDuration="3.134491021s" podCreationTimestamp="2026-03-20 17:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:13.129451047 +0000 UTC m=+1207.995276725" watchObservedRunningTime="2026-03-20 17:52:13.134491021 +0000 UTC m=+1208.000316699" Mar 20 17:52:14 crc kubenswrapper[4690]: I0320 17:52:14.105916 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"04961a40d5ad9c2d8a8464726ce2cccea54beb6f4e11067017ae42d96ec0fb72"} Mar 20 17:52:14 crc kubenswrapper[4690]: I0320 17:52:14.108941 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j8pr4-config-dng29" event={"ID":"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae","Type":"ContainerDied","Data":"70b505a4f5072a8fe51177145adda014c686869af3a089c48a816ae8c2c6d31d"} Mar 20 17:52:14 crc 
kubenswrapper[4690]: I0320 17:52:14.109549 4690 generic.go:334] "Generic (PLEG): container finished" podID="2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" containerID="70b505a4f5072a8fe51177145adda014c686869af3a089c48a816ae8c2c6d31d" exitCode=0 Mar 20 17:52:14 crc kubenswrapper[4690]: I0320 17:52:14.111410 4690 generic.go:334] "Generic (PLEG): container finished" podID="fd3dc2e5-e309-432f-a876-8cf78434e9d7" containerID="ebb24362457997a0b0694fab60326ed9c7225d8706dabcd9a2cd29616831dcc5" exitCode=0 Mar 20 17:52:14 crc kubenswrapper[4690]: I0320 17:52:14.111754 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-khb42" event={"ID":"fd3dc2e5-e309-432f-a876-8cf78434e9d7","Type":"ContainerDied","Data":"ebb24362457997a0b0694fab60326ed9c7225d8706dabcd9a2cd29616831dcc5"} Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.143242 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"1838d929d1bcd7018d0abf5fb80d4c7d83335b61a4a4f1a225b4f53e855a08c3"} Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.144049 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"2ac13b77c0270bd50628fb62725e2d313895b3e9a003f9879cbec7127e97a259"} Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.144079 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"fdbd8b263e5c257c19faceb65e79878c5fbb6c662ad4aee44a367cb2082e75eb"} Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.144099 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"e23a5775acb1cf98ecbf0ff57d68a05cb0798aa90d8a56b11ba678cc524e6c60"} Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.627336 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.630637 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-khb42" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704073 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-scripts\") pod \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704154 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run\") pod \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704211 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run-ovn\") pod \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704306 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3dc2e5-e309-432f-a876-8cf78434e9d7-operator-scripts\") pod \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\" (UID: \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\") " Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704333 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run" (OuterVolumeSpecName: "var-run") pod "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" (UID: "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704388 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvlvz\" (UniqueName: \"kubernetes.io/projected/fd3dc2e5-e309-432f-a876-8cf78434e9d7-kube-api-access-kvlvz\") pod \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\" (UID: \"fd3dc2e5-e309-432f-a876-8cf78434e9d7\") " Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704399 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" (UID: "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704448 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-additional-scripts\") pod \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704530 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-log-ovn\") pod \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.704556 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqscv\" (UniqueName: \"kubernetes.io/projected/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-kube-api-access-dqscv\") pod \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\" (UID: \"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae\") " Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.705022 4690 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.705046 4690 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.705049 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" (UID: "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.705515 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-scripts" (OuterVolumeSpecName: "scripts") pod "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" (UID: "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.705614 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd3dc2e5-e309-432f-a876-8cf78434e9d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd3dc2e5-e309-432f-a876-8cf78434e9d7" (UID: "fd3dc2e5-e309-432f-a876-8cf78434e9d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.705621 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" (UID: "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.708686 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3dc2e5-e309-432f-a876-8cf78434e9d7-kube-api-access-kvlvz" (OuterVolumeSpecName: "kube-api-access-kvlvz") pod "fd3dc2e5-e309-432f-a876-8cf78434e9d7" (UID: "fd3dc2e5-e309-432f-a876-8cf78434e9d7"). InnerVolumeSpecName "kube-api-access-kvlvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.709222 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-kube-api-access-dqscv" (OuterVolumeSpecName: "kube-api-access-dqscv") pod "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" (UID: "2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae"). InnerVolumeSpecName "kube-api-access-dqscv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.806678 4690 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.806724 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqscv\" (UniqueName: \"kubernetes.io/projected/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-kube-api-access-dqscv\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.806741 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.806751 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3dc2e5-e309-432f-a876-8cf78434e9d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.806760 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvlvz\" (UniqueName: \"kubernetes.io/projected/fd3dc2e5-e309-432f-a876-8cf78434e9d7-kube-api-access-kvlvz\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:15 crc kubenswrapper[4690]: I0320 17:52:15.806767 4690 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.153894 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-khb42" event={"ID":"fd3dc2e5-e309-432f-a876-8cf78434e9d7","Type":"ContainerDied","Data":"9f794d3d1112c616a7f5c3fb5e189e8284c457c9d61817439fb66361493df0d9"} Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.153940 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f794d3d1112c616a7f5c3fb5e189e8284c457c9d61817439fb66361493df0d9" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.154000 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-khb42" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.165530 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"3a642499cb6245e5496e4eee6fc19a2e0f46ee2f1c1f47d03c38f17df570e111"} Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.165885 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"a4a03dbcc19678be4ef31012af70ea4026e578ecedd1b9a39be994811220d3dc"} Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.166075 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4191d8a1-c023-4412-a90c-e819672da33f","Type":"ContainerStarted","Data":"1c4dd08f8410723691ed73b54a52987d4f108d4bac1d56c0023955a5625eedb1"} Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.169280 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j8pr4-config-dng29" event={"ID":"2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae","Type":"ContainerDied","Data":"55d02649d2f962c7b1c9c628217844a045354498ec2636bb1f1fb6f5c588501e"} Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.169771 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j8pr4-config-dng29" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.169792 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55d02649d2f962c7b1c9c628217844a045354498ec2636bb1f1fb6f5c588501e" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.243608 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.912676343 podStartE2EDuration="32.243589883s" podCreationTimestamp="2026-03-20 17:51:44 +0000 UTC" firstStartedPulling="2026-03-20 17:52:01.864567177 +0000 UTC m=+1196.730392855" lastFinishedPulling="2026-03-20 17:52:14.195480697 +0000 UTC m=+1209.061306395" observedRunningTime="2026-03-20 17:52:16.229592462 +0000 UTC m=+1211.095418180" watchObservedRunningTime="2026-03-20 17:52:16.243589883 +0000 UTC m=+1211.109415551" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.540536 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-zxhrx"] Mar 20 17:52:16 crc kubenswrapper[4690]: E0320 17:52:16.541301 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd3dc2e5-e309-432f-a876-8cf78434e9d7" containerName="mariadb-account-create-update" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.541326 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd3dc2e5-e309-432f-a876-8cf78434e9d7" containerName="mariadb-account-create-update" Mar 20 17:52:16 crc kubenswrapper[4690]: E0320 17:52:16.541344 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" containerName="ovn-config" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.541356 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" containerName="ovn-config" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.541569 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd3dc2e5-e309-432f-a876-8cf78434e9d7" containerName="mariadb-account-create-update" Mar 20 17:52:16 crc 
kubenswrapper[4690]: I0320 17:52:16.541615 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" containerName="ovn-config" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.542669 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.548557 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.558819 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-zxhrx"] Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.640249 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-svc\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.640351 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llqxv\" (UniqueName: \"kubernetes.io/projected/9c68dd56-4783-4d46-870d-cdb5843ff342-kube-api-access-llqxv\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.640382 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.640493 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-config\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.640542 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.640577 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.717752 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-j8pr4-config-dng29"] Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.725237 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-j8pr4-config-dng29"] Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.741529 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.741613 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-config\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.741668 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.741716 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.741787 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-svc\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.741807 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llqxv\" (UniqueName: \"kubernetes.io/projected/9c68dd56-4783-4d46-870d-cdb5843ff342-kube-api-access-llqxv\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.742453 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.742496 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-config\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.742710 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.743073 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.743183 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-svc\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.758035 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llqxv\" (UniqueName: \"kubernetes.io/projected/9c68dd56-4783-4d46-870d-cdb5843ff342-kube-api-access-llqxv\") pod \"dnsmasq-dns-764c5664d7-zxhrx\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.822910 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j8pr4-config-9b9kh"] Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.823923 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.832213 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j8pr4-config-9b9kh"] Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.832640 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.859415 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.951124 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-additional-scripts\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.951839 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-scripts\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.951881 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.951921 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9mv9\" (UniqueName: \"kubernetes.io/projected/df31193b-3ee8-473c-aacc-47950f32cacf-kube-api-access-b9mv9\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.951946 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run-ovn\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:16 crc kubenswrapper[4690]: I0320 17:52:16.951973 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-log-ovn\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.053946 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-scripts\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.054227 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.054288 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9mv9\" (UniqueName: 
\"kubernetes.io/projected/df31193b-3ee8-473c-aacc-47950f32cacf-kube-api-access-b9mv9\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.054313 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run-ovn\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.054343 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-log-ovn\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.054384 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-additional-scripts\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.054644 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.054646 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run-ovn\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.054717 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-log-ovn\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.055375 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-additional-scripts\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.057990 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-scripts\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.087614 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9mv9\" (UniqueName: 
\"kubernetes.io/projected/df31193b-3ee8-473c-aacc-47950f32cacf-kube-api-access-b9mv9\") pod \"ovn-controller-j8pr4-config-9b9kh\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.142902 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.307792 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-zxhrx"] Mar 20 17:52:17 crc kubenswrapper[4690]: W0320 17:52:17.316467 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c68dd56_4783_4d46_870d_cdb5843ff342.slice/crio-99ffe126510b2c0cbbdb404d4bc888ecd635cd7a52bf170c9d7f23d274fd8870 WatchSource:0}: Error finding container 99ffe126510b2c0cbbdb404d4bc888ecd635cd7a52bf170c9d7f23d274fd8870: Status 404 returned error can't find the container with id 99ffe126510b2c0cbbdb404d4bc888ecd635cd7a52bf170c9d7f23d274fd8870 Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.493628 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-j8pr4" Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.620329 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j8pr4-config-9b9kh"] Mar 20 17:52:17 crc kubenswrapper[4690]: W0320 17:52:17.655137 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf31193b_3ee8_473c_aacc_47950f32cacf.slice/crio-3d713ecee6d168d49f457d5abc6e705c82da74affcca1c9d9b37413bfe7c406f WatchSource:0}: Error finding container 3d713ecee6d168d49f457d5abc6e705c82da74affcca1c9d9b37413bfe7c406f: Status 404 returned error can't find the container with id 3d713ecee6d168d49f457d5abc6e705c82da74affcca1c9d9b37413bfe7c406f Mar 20 17:52:17 crc kubenswrapper[4690]: I0320 17:52:17.893108 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae" path="/var/lib/kubelet/pods/2b2cf7f9-a868-4fd3-9065-fbebe6c6d6ae/volumes" Mar 20 17:52:18 crc kubenswrapper[4690]: I0320 17:52:18.188293 4690 generic.go:334] "Generic (PLEG): container finished" podID="9c68dd56-4783-4d46-870d-cdb5843ff342" containerID="fc3f0da12c63a78519e5d130c139d0e75dd9bc8e62fd2f2a4ab1adb98f49cc1b" exitCode=0 Mar 20 17:52:18 crc kubenswrapper[4690]: I0320 17:52:18.188361 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" event={"ID":"9c68dd56-4783-4d46-870d-cdb5843ff342","Type":"ContainerDied","Data":"fc3f0da12c63a78519e5d130c139d0e75dd9bc8e62fd2f2a4ab1adb98f49cc1b"} Mar 20 17:52:18 crc kubenswrapper[4690]: I0320 17:52:18.188385 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" event={"ID":"9c68dd56-4783-4d46-870d-cdb5843ff342","Type":"ContainerStarted","Data":"99ffe126510b2c0cbbdb404d4bc888ecd635cd7a52bf170c9d7f23d274fd8870"} Mar 20 17:52:18 crc kubenswrapper[4690]: I0320 17:52:18.191766 4690 generic.go:334] "Generic (PLEG): container finished" podID="df31193b-3ee8-473c-aacc-47950f32cacf" containerID="c301d4b9a666f9ed31a11c29a6644edc9964b4202ffff0da85d905570418c3c8" exitCode=0 Mar 20 17:52:18 crc kubenswrapper[4690]: I0320 17:52:18.191809 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-j8pr4-config-9b9kh" event={"ID":"df31193b-3ee8-473c-aacc-47950f32cacf","Type":"ContainerDied","Data":"c301d4b9a666f9ed31a11c29a6644edc9964b4202ffff0da85d905570418c3c8"} Mar 20 17:52:18 crc kubenswrapper[4690]: I0320 17:52:18.191836 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j8pr4-config-9b9kh" event={"ID":"df31193b-3ee8-473c-aacc-47950f32cacf","Type":"ContainerStarted","Data":"3d713ecee6d168d49f457d5abc6e705c82da74affcca1c9d9b37413bfe7c406f"} Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.152907 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.224611 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" event={"ID":"9c68dd56-4783-4d46-870d-cdb5843ff342","Type":"ContainerStarted","Data":"04a913f5a73faf7628592bc268a024dd8bfde9048ee5435639ece269f06fd896"} Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.224832 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.230477 4690 generic.go:334] "Generic (PLEG): container finished" podID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerID="3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406" exitCode=0 Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.230490 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7","Type":"ContainerDied","Data":"3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406"} Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.233746 4690 generic.go:334] "Generic (PLEG): container finished" podID="d08ec433-4043-43b3-ae56-de712919babe" containerID="d0564bea1fff016e08463a9c5e3dcfd1bde664bd8343722644716c94489ee616" exitCode=0 Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.233867 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x2bws" event={"ID":"d08ec433-4043-43b3-ae56-de712919babe","Type":"ContainerDied","Data":"d0564bea1fff016e08463a9c5e3dcfd1bde664bd8343722644716c94489ee616"} Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.246283 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" podStartSLOduration=3.246215167 podStartE2EDuration="3.246215167s" podCreationTimestamp="2026-03-20 17:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:19.24390808 +0000 UTC m=+1214.109733778" watchObservedRunningTime="2026-03-20 17:52:19.246215167 +0000 UTC m=+1214.112040845" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.669195 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815020 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9mv9\" (UniqueName: \"kubernetes.io/projected/df31193b-3ee8-473c-aacc-47950f32cacf-kube-api-access-b9mv9\") pod \"df31193b-3ee8-473c-aacc-47950f32cacf\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815094 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run\") pod \"df31193b-3ee8-473c-aacc-47950f32cacf\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815160 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-log-ovn\") pod \"df31193b-3ee8-473c-aacc-47950f32cacf\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815187 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run-ovn\") pod \"df31193b-3ee8-473c-aacc-47950f32cacf\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815202 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-scripts\") pod \"df31193b-3ee8-473c-aacc-47950f32cacf\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815305 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-additional-scripts\") pod \"df31193b-3ee8-473c-aacc-47950f32cacf\" (UID: \"df31193b-3ee8-473c-aacc-47950f32cacf\") " Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815644 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "df31193b-3ee8-473c-aacc-47950f32cacf" (UID: "df31193b-3ee8-473c-aacc-47950f32cacf"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815687 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "df31193b-3ee8-473c-aacc-47950f32cacf" (UID: "df31193b-3ee8-473c-aacc-47950f32cacf"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.815703 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run" (OuterVolumeSpecName: "var-run") pod "df31193b-3ee8-473c-aacc-47950f32cacf" (UID: "df31193b-3ee8-473c-aacc-47950f32cacf"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.816269 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "df31193b-3ee8-473c-aacc-47950f32cacf" (UID: "df31193b-3ee8-473c-aacc-47950f32cacf"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.816608 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-scripts" (OuterVolumeSpecName: "scripts") pod "df31193b-3ee8-473c-aacc-47950f32cacf" (UID: "df31193b-3ee8-473c-aacc-47950f32cacf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.821351 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df31193b-3ee8-473c-aacc-47950f32cacf-kube-api-access-b9mv9" (OuterVolumeSpecName: "kube-api-access-b9mv9") pod "df31193b-3ee8-473c-aacc-47950f32cacf" (UID: "df31193b-3ee8-473c-aacc-47950f32cacf"). InnerVolumeSpecName "kube-api-access-b9mv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.917541 4690 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.917790 4690 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.918552 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.918654 4690 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/df31193b-3ee8-473c-aacc-47950f32cacf-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.918725 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9mv9\" (UniqueName: \"kubernetes.io/projected/df31193b-3ee8-473c-aacc-47950f32cacf-kube-api-access-b9mv9\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:19 crc kubenswrapper[4690]: I0320 17:52:19.918781 4690 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df31193b-3ee8-473c-aacc-47950f32cacf-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.242332 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j8pr4-config-9b9kh" event={"ID":"df31193b-3ee8-473c-aacc-47950f32cacf","Type":"ContainerDied","Data":"3d713ecee6d168d49f457d5abc6e705c82da74affcca1c9d9b37413bfe7c406f"} Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.242622 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d713ecee6d168d49f457d5abc6e705c82da74affcca1c9d9b37413bfe7c406f" Mar 20 17:52:20 crc 
kubenswrapper[4690]: I0320 17:52:20.242723 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j8pr4-config-9b9kh" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.245426 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7","Type":"ContainerStarted","Data":"09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced"} Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.246321 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.278506 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.408994889 podStartE2EDuration="1m23.27848552s" podCreationTimestamp="2026-03-20 17:50:57 +0000 UTC" firstStartedPulling="2026-03-20 17:51:09.511972931 +0000 UTC m=+1144.377798609" lastFinishedPulling="2026-03-20 17:51:46.381463552 +0000 UTC m=+1181.247289240" observedRunningTime="2026-03-20 17:52:20.274601429 +0000 UTC m=+1215.140427147" watchObservedRunningTime="2026-03-20 17:52:20.27848552 +0000 UTC m=+1215.144311198" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.651484 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-x2bws" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.732841 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-config-data\") pod \"d08ec433-4043-43b3-ae56-de712919babe\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.732945 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-db-sync-config-data\") pod \"d08ec433-4043-43b3-ae56-de712919babe\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.732991 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-combined-ca-bundle\") pod \"d08ec433-4043-43b3-ae56-de712919babe\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.733033 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd7xx\" (UniqueName: \"kubernetes.io/projected/d08ec433-4043-43b3-ae56-de712919babe-kube-api-access-fd7xx\") pod \"d08ec433-4043-43b3-ae56-de712919babe\" (UID: \"d08ec433-4043-43b3-ae56-de712919babe\") " Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.738815 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08ec433-4043-43b3-ae56-de712919babe-kube-api-access-fd7xx" (OuterVolumeSpecName: "kube-api-access-fd7xx") pod "d08ec433-4043-43b3-ae56-de712919babe" (UID: "d08ec433-4043-43b3-ae56-de712919babe"). InnerVolumeSpecName "kube-api-access-fd7xx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.741865 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d08ec433-4043-43b3-ae56-de712919babe" (UID: "d08ec433-4043-43b3-ae56-de712919babe"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.753888 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-j8pr4-config-9b9kh"] Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.765983 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d08ec433-4043-43b3-ae56-de712919babe" (UID: "d08ec433-4043-43b3-ae56-de712919babe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.766186 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-j8pr4-config-9b9kh"] Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.775518 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-config-data" (OuterVolumeSpecName: "config-data") pod "d08ec433-4043-43b3-ae56-de712919babe" (UID: "d08ec433-4043-43b3-ae56-de712919babe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.835194 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.835416 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd7xx\" (UniqueName: \"kubernetes.io/projected/d08ec433-4043-43b3-ae56-de712919babe-kube-api-access-fd7xx\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.835508 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:20 crc kubenswrapper[4690]: I0320 17:52:20.835564 4690 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d08ec433-4043-43b3-ae56-de712919babe-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.258738 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x2bws" event={"ID":"d08ec433-4043-43b3-ae56-de712919babe","Type":"ContainerDied","Data":"4bbf117be75e138da0805afdf4c344e1c36e887e44aab7bb40732373ab0c8635"} Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.259135 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bbf117be75e138da0805afdf4c344e1c36e887e44aab7bb40732373ab0c8635" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.258757 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x2bws" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.761095 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-zxhrx"] Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.761440 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" podUID="9c68dd56-4783-4d46-870d-cdb5843ff342" containerName="dnsmasq-dns" containerID="cri-o://04a913f5a73faf7628592bc268a024dd8bfde9048ee5435639ece269f06fd896" gracePeriod=10 Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.791161 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-f6h5n"] Mar 20 17:52:21 crc kubenswrapper[4690]: E0320 17:52:21.791581 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df31193b-3ee8-473c-aacc-47950f32cacf" containerName="ovn-config" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.791598 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="df31193b-3ee8-473c-aacc-47950f32cacf" containerName="ovn-config" Mar 20 17:52:21 crc kubenswrapper[4690]: E0320 17:52:21.791639 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08ec433-4043-43b3-ae56-de712919babe" containerName="glance-db-sync" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.791649 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08ec433-4043-43b3-ae56-de712919babe" containerName="glance-db-sync" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.791855 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08ec433-4043-43b3-ae56-de712919babe" containerName="glance-db-sync" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.791867 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="df31193b-3ee8-473c-aacc-47950f32cacf" containerName="ovn-config" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.792890 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.810113 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-f6h5n"] Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.856307 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.856363 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.856413 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-config\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.856707 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-742ks\" (UniqueName: \"kubernetes.io/projected/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-kube-api-access-742ks\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.856743 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.856847 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.894290 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df31193b-3ee8-473c-aacc-47950f32cacf" path="/var/lib/kubelet/pods/df31193b-3ee8-473c-aacc-47950f32cacf/volumes" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.957892 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.957934 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-742ks\" (UniqueName: 
\"kubernetes.io/projected/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-kube-api-access-742ks\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.957965 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.957988 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.958009 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.958041 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-config\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.958897 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-config\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.958941 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.959125 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.959481 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.959749 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: 
\"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:21 crc kubenswrapper[4690]: I0320 17:52:21.977044 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-742ks\" (UniqueName: \"kubernetes.io/projected/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-kube-api-access-742ks\") pod \"dnsmasq-dns-74f6bcbc87-f6h5n\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:22 crc kubenswrapper[4690]: I0320 17:52:22.115364 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.287877 4690 generic.go:334] "Generic (PLEG): container finished" podID="9c68dd56-4783-4d46-870d-cdb5843ff342" containerID="04a913f5a73faf7628592bc268a024dd8bfde9048ee5435639ece269f06fd896" exitCode=0 Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.287984 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" event={"ID":"9c68dd56-4783-4d46-870d-cdb5843ff342","Type":"ContainerDied","Data":"04a913f5a73faf7628592bc268a024dd8bfde9048ee5435639ece269f06fd896"} Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.288400 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" event={"ID":"9c68dd56-4783-4d46-870d-cdb5843ff342","Type":"ContainerDied","Data":"99ffe126510b2c0cbbdb404d4bc888ecd635cd7a52bf170c9d7f23d274fd8870"} Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.288427 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99ffe126510b2c0cbbdb404d4bc888ecd635cd7a52bf170c9d7f23d274fd8870" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.294814 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.380242 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-config\") pod \"9c68dd56-4783-4d46-870d-cdb5843ff342\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.380364 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-svc\") pod \"9c68dd56-4783-4d46-870d-cdb5843ff342\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.380404 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llqxv\" (UniqueName: \"kubernetes.io/projected/9c68dd56-4783-4d46-870d-cdb5843ff342-kube-api-access-llqxv\") pod \"9c68dd56-4783-4d46-870d-cdb5843ff342\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.380427 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-sb\") pod \"9c68dd56-4783-4d46-870d-cdb5843ff342\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.380451 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-nb\") pod \"9c68dd56-4783-4d46-870d-cdb5843ff342\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.380538 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-swift-storage-0\") pod \"9c68dd56-4783-4d46-870d-cdb5843ff342\" (UID: \"9c68dd56-4783-4d46-870d-cdb5843ff342\") " Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.385624 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c68dd56-4783-4d46-870d-cdb5843ff342-kube-api-access-llqxv" (OuterVolumeSpecName: "kube-api-access-llqxv") pod "9c68dd56-4783-4d46-870d-cdb5843ff342" (UID: "9c68dd56-4783-4d46-870d-cdb5843ff342"). InnerVolumeSpecName "kube-api-access-llqxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.422024 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c68dd56-4783-4d46-870d-cdb5843ff342" (UID: "9c68dd56-4783-4d46-870d-cdb5843ff342"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.422941 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9c68dd56-4783-4d46-870d-cdb5843ff342" (UID: "9c68dd56-4783-4d46-870d-cdb5843ff342"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.427193 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c68dd56-4783-4d46-870d-cdb5843ff342" (UID: "9c68dd56-4783-4d46-870d-cdb5843ff342"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.430334 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c68dd56-4783-4d46-870d-cdb5843ff342" (UID: "9c68dd56-4783-4d46-870d-cdb5843ff342"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.438684 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-config" (OuterVolumeSpecName: "config") pod "9c68dd56-4783-4d46-870d-cdb5843ff342" (UID: "9c68dd56-4783-4d46-870d-cdb5843ff342"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.482351 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.482383 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llqxv\" (UniqueName: \"kubernetes.io/projected/9c68dd56-4783-4d46-870d-cdb5843ff342-kube-api-access-llqxv\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.482399 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.482411 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.482422 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:22.482434 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c68dd56-4783-4d46-870d-cdb5843ff342-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:23.300547 4690 generic.go:334] "Generic (PLEG): container finished" podID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerID="e32fdf8c2102bd7ba3cad331c73e818ce6d0901e8c7c47f023f4120143d2d905" exitCode=0 Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:23.300692 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc","Type":"ContainerDied","Data":"e32fdf8c2102bd7ba3cad331c73e818ce6d0901e8c7c47f023f4120143d2d905"} Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 
17:52:23.301146 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-zxhrx" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:23.463421 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-zxhrx"] Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:23.488475 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-zxhrx"] Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:23.892891 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c68dd56-4783-4d46-870d-cdb5843ff342" path="/var/lib/kubelet/pods/9c68dd56-4783-4d46-870d-cdb5843ff342/volumes" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.273862 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.273931 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.273981 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.274912 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab2561b6600e9d6bebb46c2c746c35623906cf56d05e6dcd356c447e3e87dfa1"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.274994 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://ab2561b6600e9d6bebb46c2c746c35623906cf56d05e6dcd356c447e3e87dfa1" gracePeriod=600 Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.312449 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc","Type":"ContainerStarted","Data":"014e5dcc51458e00ea1c1e92fc8066e86e8ba38713cdd0bb150493ff18fbc998"} Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.312690 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.351207 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371949.503597 podStartE2EDuration="1m27.351178999s" podCreationTimestamp="2026-03-20 17:50:57 +0000 UTC" firstStartedPulling="2026-03-20 17:51:11.040485465 +0000 UTC m=+1145.906311143" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:24.340483323 +0000 UTC m=+1219.206309011" watchObservedRunningTime="2026-03-20 17:52:24.351178999 +0000 UTC 
m=+1219.217004697" Mar 20 17:52:24 crc kubenswrapper[4690]: I0320 17:52:24.977958 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-f6h5n"] Mar 20 17:52:24 crc kubenswrapper[4690]: W0320 17:52:24.986891 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e029b9_bf4d_4700_9a5f_c35bd3459b15.slice/crio-3511c0655e7d34dc813afbc16bf7ef428907f3729fb70fc4c23bc7e54e518e06 WatchSource:0}: Error finding container 3511c0655e7d34dc813afbc16bf7ef428907f3729fb70fc4c23bc7e54e518e06: Status 404 returned error can't find the container with id 3511c0655e7d34dc813afbc16bf7ef428907f3729fb70fc4c23bc7e54e518e06 Mar 20 17:52:25 crc kubenswrapper[4690]: I0320 17:52:25.322138 4690 generic.go:334] "Generic (PLEG): container finished" podID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerID="423da00fb24e98f7484f72a09b566ac68729f7ec22d23ce862306c7ff6608587" exitCode=0 Mar 20 17:52:25 crc kubenswrapper[4690]: I0320 17:52:25.322208 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" event={"ID":"d5e029b9-bf4d-4700-9a5f-c35bd3459b15","Type":"ContainerDied","Data":"423da00fb24e98f7484f72a09b566ac68729f7ec22d23ce862306c7ff6608587"} Mar 20 17:52:25 crc kubenswrapper[4690]: I0320 17:52:25.322484 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" event={"ID":"d5e029b9-bf4d-4700-9a5f-c35bd3459b15","Type":"ContainerStarted","Data":"3511c0655e7d34dc813afbc16bf7ef428907f3729fb70fc4c23bc7e54e518e06"} Mar 20 17:52:25 crc kubenswrapper[4690]: I0320 17:52:25.325603 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="ab2561b6600e9d6bebb46c2c746c35623906cf56d05e6dcd356c447e3e87dfa1" exitCode=0 Mar 20 17:52:25 crc kubenswrapper[4690]: I0320 17:52:25.325659 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"ab2561b6600e9d6bebb46c2c746c35623906cf56d05e6dcd356c447e3e87dfa1"} Mar 20 17:52:25 crc kubenswrapper[4690]: I0320 17:52:25.325702 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"c6c26ff37905c4c37c818991d48555bc929721ae7acd19a88c41bd55b417a5fe"} Mar 20 17:52:25 crc kubenswrapper[4690]: I0320 17:52:25.325720 4690 scope.go:117] "RemoveContainer" containerID="3597106c9e9367c28d243129fc42edbd4550b54914b1aeed86c0200ac6936ead" Mar 20 17:52:26 crc kubenswrapper[4690]: I0320 17:52:26.337685 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" event={"ID":"d5e029b9-bf4d-4700-9a5f-c35bd3459b15","Type":"ContainerStarted","Data":"75ee5ffb621d9785b335466768a7b7cc4b8bfa373b10846427a02d4c1abc31cd"} Mar 20 17:52:26 crc kubenswrapper[4690]: I0320 17:52:26.338135 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:26 crc kubenswrapper[4690]: I0320 17:52:26.359644 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" podStartSLOduration=5.35962036 podStartE2EDuration="5.35962036s" podCreationTimestamp="2026-03-20 17:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:26.357248572 +0000 UTC m=+1221.223074290" watchObservedRunningTime="2026-03-20 17:52:26.35962036 +0000 UTC m=+1221.225446108" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.117539 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.195164 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b56kg"] Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.195576 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-b56kg" podUID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" containerName="dnsmasq-dns" containerID="cri-o://1108152d0e8f6299034524770dd94f6864cd5941547739ec01c98a829e06ef56" gracePeriod=10 Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.414934 4690 generic.go:334] "Generic (PLEG): container finished" podID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" containerID="1108152d0e8f6299034524770dd94f6864cd5941547739ec01c98a829e06ef56" exitCode=0 Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.414979 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b56kg" event={"ID":"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce","Type":"ContainerDied","Data":"1108152d0e8f6299034524770dd94f6864cd5941547739ec01c98a829e06ef56"} Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.726559 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.782317 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqqb7\" (UniqueName: \"kubernetes.io/projected/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-kube-api-access-zqqb7\") pod \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.782369 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-nb\") pod \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.782392 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-sb\") pod \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.782431 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-dns-svc\") pod \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.782484 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-config\") pod \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\" (UID: \"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce\") " Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.788552 4690 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-kube-api-access-zqqb7" (OuterVolumeSpecName: "kube-api-access-zqqb7") pod "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" (UID: "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce"). InnerVolumeSpecName "kube-api-access-zqqb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.822285 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-config" (OuterVolumeSpecName: "config") pod "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" (UID: "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.824068 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" (UID: "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.825621 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" (UID: "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.839018 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" (UID: "5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.884777 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.884817 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqqb7\" (UniqueName: \"kubernetes.io/projected/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-kube-api-access-zqqb7\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.884830 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.884840 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:32 crc kubenswrapper[4690]: I0320 17:52:32.884847 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:33 crc kubenswrapper[4690]: I0320 17:52:33.423705 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b56kg" event={"ID":"5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce","Type":"ContainerDied","Data":"629c969a0c79b47e304c722bbdc5b10f3345de19739324c1f0dd3c9ea65e8a75"} Mar 20 17:52:33 crc kubenswrapper[4690]: I0320 17:52:33.423748 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-b56kg" Mar 20 17:52:33 crc kubenswrapper[4690]: I0320 17:52:33.424048 4690 scope.go:117] "RemoveContainer" containerID="1108152d0e8f6299034524770dd94f6864cd5941547739ec01c98a829e06ef56" Mar 20 17:52:33 crc kubenswrapper[4690]: I0320 17:52:33.448043 4690 scope.go:117] "RemoveContainer" containerID="a2e0e3a98dc062d76f2ede084588cd281812e1a12f6d49f55bc2043c7783f75b" Mar 20 17:52:33 crc kubenswrapper[4690]: I0320 17:52:33.515915 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b56kg"] Mar 20 17:52:33 crc kubenswrapper[4690]: I0320 17:52:33.522608 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b56kg"] Mar 20 17:52:33 crc kubenswrapper[4690]: I0320 17:52:33.896843 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" path="/var/lib/kubelet/pods/5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce/volumes" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.519449 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.694468 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.924019 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sh9q5"] Mar 20 17:52:38 crc kubenswrapper[4690]: E0320 17:52:38.924480 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" containerName="dnsmasq-dns" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.924499 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" containerName="dnsmasq-dns" Mar 20 17:52:38 crc kubenswrapper[4690]: E0320 17:52:38.924529 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c68dd56-4783-4d46-870d-cdb5843ff342" containerName="init" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.924538 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c68dd56-4783-4d46-870d-cdb5843ff342" containerName="init" Mar 20 17:52:38 crc kubenswrapper[4690]: E0320 17:52:38.924550 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c68dd56-4783-4d46-870d-cdb5843ff342" containerName="dnsmasq-dns" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.924559 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c68dd56-4783-4d46-870d-cdb5843ff342" containerName="dnsmasq-dns" Mar 20 17:52:38 crc kubenswrapper[4690]: E0320 17:52:38.924585 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" containerName="init" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.924593 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" containerName="init" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.924781 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc2feb8-84ad-46f0-aa5e-3b17b33f1bce" containerName="dnsmasq-dns" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.924802 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c68dd56-4783-4d46-870d-cdb5843ff342" containerName="dnsmasq-dns" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.925440 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:38 crc kubenswrapper[4690]: I0320 17:52:38.944009 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sh9q5"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.019878 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-61c8-account-create-update-nqdzc"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.021095 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.023104 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.036208 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-61c8-account-create-update-nqdzc"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.105411 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-operator-scripts\") pod \"cinder-db-create-sh9q5\" (UID: \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\") " pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.105518 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsc4k\" (UniqueName: \"kubernetes.io/projected/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-kube-api-access-qsc4k\") pod \"cinder-db-create-sh9q5\" (UID: \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\") " pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.118058 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h6p97"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.119175 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.128773 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h6p97"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.171816 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7rf8f"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.172790 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.174715 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.175082 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.175788 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.175938 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7l4fq" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.180679 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7rf8f"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.207338 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsc4k\" (UniqueName: \"kubernetes.io/projected/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-kube-api-access-qsc4k\") pod \"cinder-db-create-sh9q5\" (UID: \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\") " pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.207386 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8c999e8-0459-4d7a-8369-a44ec4af0bde-operator-scripts\") pod \"cinder-61c8-account-create-update-nqdzc\" (UID: \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\") " pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.207478 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2z9\" (UniqueName: \"kubernetes.io/projected/c8c999e8-0459-4d7a-8369-a44ec4af0bde-kube-api-access-lt2z9\") pod \"cinder-61c8-account-create-update-nqdzc\" (UID: \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\") " pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.207502 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-operator-scripts\") pod \"cinder-db-create-sh9q5\" (UID: \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\") " pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.208224 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-operator-scripts\") pod \"cinder-db-create-sh9q5\" (UID: \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\") " pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.221977 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c546-account-create-update-t5wsm"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.222947 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.232430 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.235434 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-w8n6k"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.236511 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.240795 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsc4k\" (UniqueName: \"kubernetes.io/projected/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-kube-api-access-qsc4k\") pod \"cinder-db-create-sh9q5\" (UID: \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\") " pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.245136 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c546-account-create-update-t5wsm"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.246167 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.252734 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-w8n6k"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.309663 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-config-data\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.309905 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2z9\" (UniqueName: \"kubernetes.io/projected/c8c999e8-0459-4d7a-8369-a44ec4af0bde-kube-api-access-lt2z9\") pod \"cinder-61c8-account-create-update-nqdzc\" (UID: \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\") " pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.310015 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a5619a-4b07-4071-9e82-35d8f8a32f19-operator-scripts\") pod \"barbican-db-create-h6p97\" (UID: \"37a5619a-4b07-4071-9e82-35d8f8a32f19\") " pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.310103 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-combined-ca-bundle\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.310285 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8c999e8-0459-4d7a-8369-a44ec4af0bde-operator-scripts\") pod \"cinder-61c8-account-create-update-nqdzc\" (UID: \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\") " pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 
17:52:39.310392 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tlwb\" (UniqueName: \"kubernetes.io/projected/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-kube-api-access-2tlwb\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.310495 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vck\" (UniqueName: \"kubernetes.io/projected/37a5619a-4b07-4071-9e82-35d8f8a32f19-kube-api-access-p2vck\") pod \"barbican-db-create-h6p97\" (UID: \"37a5619a-4b07-4071-9e82-35d8f8a32f19\") " pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.311178 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8c999e8-0459-4d7a-8369-a44ec4af0bde-operator-scripts\") pod \"cinder-61c8-account-create-update-nqdzc\" (UID: \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\") " pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.331133 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2z9\" (UniqueName: \"kubernetes.io/projected/c8c999e8-0459-4d7a-8369-a44ec4af0bde-kube-api-access-lt2z9\") pod \"cinder-61c8-account-create-update-nqdzc\" (UID: \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\") " pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.339048 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.415619 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2vck\" (UniqueName: \"kubernetes.io/projected/37a5619a-4b07-4071-9e82-35d8f8a32f19-kube-api-access-p2vck\") pod \"barbican-db-create-h6p97\" (UID: \"37a5619a-4b07-4071-9e82-35d8f8a32f19\") " pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.415682 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2rbw\" (UniqueName: \"kubernetes.io/projected/7a616d63-75d6-49ee-b12c-e68bcc6303c8-kube-api-access-v2rbw\") pod \"neutron-db-create-w8n6k\" (UID: \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\") " pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.415725 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-operator-scripts\") pod \"neutron-c546-account-create-update-t5wsm\" (UID: \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\") " pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.415755 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a616d63-75d6-49ee-b12c-e68bcc6303c8-operator-scripts\") pod \"neutron-db-create-w8n6k\" (UID: \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\") " pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.415816 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-config-data\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.415850 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzz4w\" (UniqueName: \"kubernetes.io/projected/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-kube-api-access-dzz4w\") pod \"neutron-c546-account-create-update-t5wsm\" (UID: \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\") " pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.415902 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a5619a-4b07-4071-9e82-35d8f8a32f19-operator-scripts\") pod \"barbican-db-create-h6p97\" (UID: \"37a5619a-4b07-4071-9e82-35d8f8a32f19\") " pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.418421 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-combined-ca-bundle\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.418558 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tlwb\" (UniqueName: \"kubernetes.io/projected/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-kube-api-access-2tlwb\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.424016 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a5619a-4b07-4071-9e82-35d8f8a32f19-operator-scripts\") pod \"barbican-db-create-h6p97\" (UID: \"37a5619a-4b07-4071-9e82-35d8f8a32f19\") " pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.428241 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-combined-ca-bundle\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.438652 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2vck\" (UniqueName: \"kubernetes.io/projected/37a5619a-4b07-4071-9e82-35d8f8a32f19-kube-api-access-p2vck\") pod \"barbican-db-create-h6p97\" (UID: \"37a5619a-4b07-4071-9e82-35d8f8a32f19\") " pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.438814 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-config-data\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.438876 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tlwb\" (UniqueName: 
\"kubernetes.io/projected/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-kube-api-access-2tlwb\") pod \"keystone-db-sync-7rf8f\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.441293 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8927-account-create-update-r78hx"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.442359 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.443682 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.448774 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.450168 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8927-account-create-update-r78hx"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.486087 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.519896 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2rbw\" (UniqueName: \"kubernetes.io/projected/7a616d63-75d6-49ee-b12c-e68bcc6303c8-kube-api-access-v2rbw\") pod \"neutron-db-create-w8n6k\" (UID: \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\") " pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.519962 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-operator-scripts\") pod \"neutron-c546-account-create-update-t5wsm\" (UID: \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\") " pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.519986 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a616d63-75d6-49ee-b12c-e68bcc6303c8-operator-scripts\") pod \"neutron-db-create-w8n6k\" (UID: \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\") " pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.520039 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzz4w\" (UniqueName: \"kubernetes.io/projected/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-kube-api-access-dzz4w\") pod \"neutron-c546-account-create-update-t5wsm\" (UID: \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\") " pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.520668 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-operator-scripts\") pod \"neutron-c546-account-create-update-t5wsm\" (UID: \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\") " pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.520797 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7a616d63-75d6-49ee-b12c-e68bcc6303c8-operator-scripts\") pod \"neutron-db-create-w8n6k\" (UID: \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\") " pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.540663 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2rbw\" (UniqueName: \"kubernetes.io/projected/7a616d63-75d6-49ee-b12c-e68bcc6303c8-kube-api-access-v2rbw\") pod \"neutron-db-create-w8n6k\" (UID: \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\") " pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.542536 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzz4w\" (UniqueName: \"kubernetes.io/projected/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-kube-api-access-dzz4w\") pod \"neutron-c546-account-create-update-t5wsm\" (UID: \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\") " pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.621081 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wvdl\" (UniqueName: \"kubernetes.io/projected/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-kube-api-access-4wvdl\") pod \"barbican-8927-account-create-update-r78hx\" (UID: \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\") " pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.621150 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-operator-scripts\") pod \"barbican-8927-account-create-update-r78hx\" (UID: \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\") " pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.703540 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.722752 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wvdl\" (UniqueName: \"kubernetes.io/projected/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-kube-api-access-4wvdl\") pod \"barbican-8927-account-create-update-r78hx\" (UID: \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\") " pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.722804 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-operator-scripts\") pod \"barbican-8927-account-create-update-r78hx\" (UID: \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\") " pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.723655 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-operator-scripts\") pod \"barbican-8927-account-create-update-r78hx\" (UID: \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\") " pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.726540 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sh9q5"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.740920 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wvdl\" (UniqueName: \"kubernetes.io/projected/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-kube-api-access-4wvdl\") pod \"barbican-8927-account-create-update-r78hx\" (UID: \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\") " pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.765737 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.839150 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:39 crc kubenswrapper[4690]: W0320 17:52:39.902359 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8c999e8_0459_4d7a_8369_a44ec4af0bde.slice/crio-53c13238355f63aa9312ab7163c7a7551d79a3ee6664901c399da30c747ba6b3 WatchSource:0}: Error finding container 53c13238355f63aa9312ab7163c7a7551d79a3ee6664901c399da30c747ba6b3: Status 404 returned error can't find the container with id 53c13238355f63aa9312ab7163c7a7551d79a3ee6664901c399da30c747ba6b3 Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.920038 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-61c8-account-create-update-nqdzc"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.957470 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h6p97"] Mar 20 17:52:39 crc kubenswrapper[4690]: I0320 17:52:39.969951 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7rf8f"] Mar 20 17:52:39 crc kubenswrapper[4690]: W0320 17:52:39.976943 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce1a2c5d_7796_4afb_b1b1_14c8f06ae79c.slice/crio-f35bfe4e8e1eccd35299ccad1654ee45e4f46d5e1a8d660ea604173372f51746 WatchSource:0}: Error finding container f35bfe4e8e1eccd35299ccad1654ee45e4f46d5e1a8d660ea604173372f51746: Status 404 returned error can't find the container with id f35bfe4e8e1eccd35299ccad1654ee45e4f46d5e1a8d660ea604173372f51746 Mar 20 17:52:39 crc kubenswrapper[4690]: W0320 17:52:39.977950 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37a5619a_4b07_4071_9e82_35d8f8a32f19.slice/crio-f0cf84d4239bc2b4954778817c61482f3470a734fe397b483d31b404b32f1940 WatchSource:0}: Error finding container f0cf84d4239bc2b4954778817c61482f3470a734fe397b483d31b404b32f1940: Status 404 returned error can't find the container with id f0cf84d4239bc2b4954778817c61482f3470a734fe397b483d31b404b32f1940 Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.210504 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-w8n6k"] Mar 20 17:52:40 crc kubenswrapper[4690]: W0320 17:52:40.236157 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a616d63_75d6_49ee_b12c_e68bcc6303c8.slice/crio-a0c96111d1c5a6950ad4e7168f06572471bab600b2d891da96caf37d8fb42ae6 WatchSource:0}: Error finding container a0c96111d1c5a6950ad4e7168f06572471bab600b2d891da96caf37d8fb42ae6: Status 404 returned error can't find the container with id a0c96111d1c5a6950ad4e7168f06572471bab600b2d891da96caf37d8fb42ae6 Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.293508 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8927-account-create-update-r78hx"] Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.390007 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c546-account-create-update-t5wsm"] Mar 20 17:52:40 crc kubenswrapper[4690]: W0320 17:52:40.474012 4690 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56f8f9ef_16ac_496b_a070_f82d7a55e5f8.slice/crio-6da6617240aa401dd575589f561d42d29f6172ad879984820160b441d4179f2e WatchSource:0}: Error finding container 6da6617240aa401dd575589f561d42d29f6172ad879984820160b441d4179f2e: Status 404 returned error can't find the container with id 6da6617240aa401dd575589f561d42d29f6172ad879984820160b441d4179f2e Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.485153 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8n6k" event={"ID":"7a616d63-75d6-49ee-b12c-e68bcc6303c8","Type":"ContainerStarted","Data":"a0c96111d1c5a6950ad4e7168f06572471bab600b2d891da96caf37d8fb42ae6"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.487299 4690 generic.go:334] "Generic (PLEG): container finished" podID="37a5619a-4b07-4071-9e82-35d8f8a32f19" containerID="b627a95077c84b57c32b1a48a6bbaec55b07ac9158955445e7b39c8f085e3a67" exitCode=0 Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.487360 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h6p97" event={"ID":"37a5619a-4b07-4071-9e82-35d8f8a32f19","Type":"ContainerDied","Data":"b627a95077c84b57c32b1a48a6bbaec55b07ac9158955445e7b39c8f085e3a67"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.487377 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h6p97" event={"ID":"37a5619a-4b07-4071-9e82-35d8f8a32f19","Type":"ContainerStarted","Data":"f0cf84d4239bc2b4954778817c61482f3470a734fe397b483d31b404b32f1940"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.488865 4690 generic.go:334] "Generic (PLEG): container finished" podID="c8c999e8-0459-4d7a-8369-a44ec4af0bde" containerID="b11c76373714c080709968c94f055d40a82ed3ee04a0368540f50a64bdf3515d" exitCode=0 Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.488943 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-61c8-account-create-update-nqdzc" event={"ID":"c8c999e8-0459-4d7a-8369-a44ec4af0bde","Type":"ContainerDied","Data":"b11c76373714c080709968c94f055d40a82ed3ee04a0368540f50a64bdf3515d"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.488985 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-61c8-account-create-update-nqdzc" event={"ID":"c8c999e8-0459-4d7a-8369-a44ec4af0bde","Type":"ContainerStarted","Data":"53c13238355f63aa9312ab7163c7a7551d79a3ee6664901c399da30c747ba6b3"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.489963 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7rf8f" event={"ID":"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c","Type":"ContainerStarted","Data":"f35bfe4e8e1eccd35299ccad1654ee45e4f46d5e1a8d660ea604173372f51746"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.491710 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8927-account-create-update-r78hx" event={"ID":"80ed3d10-eba4-40ba-b635-7f43e2cc68d9","Type":"ContainerStarted","Data":"87b2d04b8375deaa6145e6b43f5d5f9e000ed8f9822610600171c99eecfe617e"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.493086 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c546-account-create-update-t5wsm" event={"ID":"56f8f9ef-16ac-496b-a070-f82d7a55e5f8","Type":"ContainerStarted","Data":"6da6617240aa401dd575589f561d42d29f6172ad879984820160b441d4179f2e"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.494932 4690 generic.go:334] 
"Generic (PLEG): container finished" podID="3c7fc068-1b4a-4181-9cd5-cc9eb17d691b" containerID="2a7c80b59b16162714fe6ea48b60263010a288426068f3bed299f76fa86cb2dc" exitCode=0 Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.494992 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sh9q5" event={"ID":"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b","Type":"ContainerDied","Data":"2a7c80b59b16162714fe6ea48b60263010a288426068f3bed299f76fa86cb2dc"} Mar 20 17:52:40 crc kubenswrapper[4690]: I0320 17:52:40.495025 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sh9q5" event={"ID":"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b","Type":"ContainerStarted","Data":"eb84e82978ee5c1bae4668993389d39ce1f3490a04d9600a039f4c35ccd3a31c"} Mar 20 17:52:41 crc kubenswrapper[4690]: I0320 17:52:41.505423 4690 generic.go:334] "Generic (PLEG): container finished" podID="80ed3d10-eba4-40ba-b635-7f43e2cc68d9" containerID="cced9f852b989af8b65a3dc613df98da60471356b861f80cd8ca215aec818dc9" exitCode=0 Mar 20 17:52:41 crc kubenswrapper[4690]: I0320 17:52:41.505505 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8927-account-create-update-r78hx" event={"ID":"80ed3d10-eba4-40ba-b635-7f43e2cc68d9","Type":"ContainerDied","Data":"cced9f852b989af8b65a3dc613df98da60471356b861f80cd8ca215aec818dc9"} Mar 20 17:52:41 crc kubenswrapper[4690]: I0320 17:52:41.507492 4690 generic.go:334] "Generic (PLEG): container finished" podID="56f8f9ef-16ac-496b-a070-f82d7a55e5f8" containerID="e1dffb5d84cdfe9d6befcccc1c07d8a45fcbc2d86ba715ce0f4edfcc4492de08" exitCode=0 Mar 20 17:52:41 crc kubenswrapper[4690]: I0320 17:52:41.507562 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c546-account-create-update-t5wsm" event={"ID":"56f8f9ef-16ac-496b-a070-f82d7a55e5f8","Type":"ContainerDied","Data":"e1dffb5d84cdfe9d6befcccc1c07d8a45fcbc2d86ba715ce0f4edfcc4492de08"} Mar 20 17:52:41 crc kubenswrapper[4690]: I0320 17:52:41.510566 4690 generic.go:334] "Generic (PLEG): container finished" podID="7a616d63-75d6-49ee-b12c-e68bcc6303c8" containerID="a745b162a284579d6444986b78d0fd90e5ca48bb8d085d6310eb736cbb4f4c5e" exitCode=0 Mar 20 17:52:41 crc kubenswrapper[4690]: I0320 17:52:41.510815 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8n6k" event={"ID":"7a616d63-75d6-49ee-b12c-e68bcc6303c8","Type":"ContainerDied","Data":"a745b162a284579d6444986b78d0fd90e5ca48bb8d085d6310eb736cbb4f4c5e"} Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.463637 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.470494 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.479054 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.512845 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.521712 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.542099 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.549393 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8927-account-create-update-r78hx" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.549410 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8927-account-create-update-r78hx" event={"ID":"80ed3d10-eba4-40ba-b635-7f43e2cc68d9","Type":"ContainerDied","Data":"87b2d04b8375deaa6145e6b43f5d5f9e000ed8f9822610600171c99eecfe617e"} Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.549496 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87b2d04b8375deaa6145e6b43f5d5f9e000ed8f9822610600171c99eecfe617e" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.552467 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c546-account-create-update-t5wsm" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.552470 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c546-account-create-update-t5wsm" event={"ID":"56f8f9ef-16ac-496b-a070-f82d7a55e5f8","Type":"ContainerDied","Data":"6da6617240aa401dd575589f561d42d29f6172ad879984820160b441d4179f2e"} Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.552643 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da6617240aa401dd575589f561d42d29f6172ad879984820160b441d4179f2e" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.554699 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sh9q5" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.554745 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sh9q5" event={"ID":"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b","Type":"ContainerDied","Data":"eb84e82978ee5c1bae4668993389d39ce1f3490a04d9600a039f4c35ccd3a31c"} Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.554782 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb84e82978ee5c1bae4668993389d39ce1f3490a04d9600a039f4c35ccd3a31c" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.556827 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8n6k" event={"ID":"7a616d63-75d6-49ee-b12c-e68bcc6303c8","Type":"ContainerDied","Data":"a0c96111d1c5a6950ad4e7168f06572471bab600b2d891da96caf37d8fb42ae6"} Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.557010 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c96111d1c5a6950ad4e7168f06572471bab600b2d891da96caf37d8fb42ae6" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.557229 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-w8n6k" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.563336 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h6p97" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.564007 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h6p97" event={"ID":"37a5619a-4b07-4071-9e82-35d8f8a32f19","Type":"ContainerDied","Data":"f0cf84d4239bc2b4954778817c61482f3470a734fe397b483d31b404b32f1940"} Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.564074 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0cf84d4239bc2b4954778817c61482f3470a734fe397b483d31b404b32f1940" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.571826 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-61c8-account-create-update-nqdzc" event={"ID":"c8c999e8-0459-4d7a-8369-a44ec4af0bde","Type":"ContainerDied","Data":"53c13238355f63aa9312ab7163c7a7551d79a3ee6664901c399da30c747ba6b3"} Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.571858 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53c13238355f63aa9312ab7163c7a7551d79a3ee6664901c399da30c747ba6b3" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.571898 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-61c8-account-create-update-nqdzc" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.626436 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzz4w\" (UniqueName: \"kubernetes.io/projected/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-kube-api-access-dzz4w\") pod \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\" (UID: \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.626850 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a616d63-75d6-49ee-b12c-e68bcc6303c8-operator-scripts\") pod \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\" (UID: \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.626927 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt2z9\" (UniqueName: \"kubernetes.io/projected/c8c999e8-0459-4d7a-8369-a44ec4af0bde-kube-api-access-lt2z9\") pod \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\" (UID: \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.627006 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2vck\" (UniqueName: \"kubernetes.io/projected/37a5619a-4b07-4071-9e82-35d8f8a32f19-kube-api-access-p2vck\") pod \"37a5619a-4b07-4071-9e82-35d8f8a32f19\" (UID: \"37a5619a-4b07-4071-9e82-35d8f8a32f19\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.627127 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-operator-scripts\") pod \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\" (UID: \"56f8f9ef-16ac-496b-a070-f82d7a55e5f8\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.627173 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsc4k\" (UniqueName: \"kubernetes.io/projected/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-kube-api-access-qsc4k\") pod \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\" (UID: \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\") " Mar 20 17:52:44 crc 
kubenswrapper[4690]: I0320 17:52:44.627214 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2rbw\" (UniqueName: \"kubernetes.io/projected/7a616d63-75d6-49ee-b12c-e68bcc6303c8-kube-api-access-v2rbw\") pod \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\" (UID: \"7a616d63-75d6-49ee-b12c-e68bcc6303c8\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.627342 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a5619a-4b07-4071-9e82-35d8f8a32f19-operator-scripts\") pod \"37a5619a-4b07-4071-9e82-35d8f8a32f19\" (UID: \"37a5619a-4b07-4071-9e82-35d8f8a32f19\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.627397 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-operator-scripts\") pod \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\" (UID: \"3c7fc068-1b4a-4181-9cd5-cc9eb17d691b\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.627448 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8c999e8-0459-4d7a-8369-a44ec4af0bde-operator-scripts\") pod \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\" (UID: \"c8c999e8-0459-4d7a-8369-a44ec4af0bde\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.628674 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a616d63-75d6-49ee-b12c-e68bcc6303c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a616d63-75d6-49ee-b12c-e68bcc6303c8" (UID: "7a616d63-75d6-49ee-b12c-e68bcc6303c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.628672 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56f8f9ef-16ac-496b-a070-f82d7a55e5f8" (UID: "56f8f9ef-16ac-496b-a070-f82d7a55e5f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.629727 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c7fc068-1b4a-4181-9cd5-cc9eb17d691b" (UID: "3c7fc068-1b4a-4181-9cd5-cc9eb17d691b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.629972 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a5619a-4b07-4071-9e82-35d8f8a32f19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37a5619a-4b07-4071-9e82-35d8f8a32f19" (UID: "37a5619a-4b07-4071-9e82-35d8f8a32f19"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.630861 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c999e8-0459-4d7a-8369-a44ec4af0bde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8c999e8-0459-4d7a-8369-a44ec4af0bde" (UID: "c8c999e8-0459-4d7a-8369-a44ec4af0bde"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.632162 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a5619a-4b07-4071-9e82-35d8f8a32f19-kube-api-access-p2vck" (OuterVolumeSpecName: "kube-api-access-p2vck") pod "37a5619a-4b07-4071-9e82-35d8f8a32f19" (UID: "37a5619a-4b07-4071-9e82-35d8f8a32f19"). InnerVolumeSpecName "kube-api-access-p2vck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.632512 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a616d63-75d6-49ee-b12c-e68bcc6303c8-kube-api-access-v2rbw" (OuterVolumeSpecName: "kube-api-access-v2rbw") pod "7a616d63-75d6-49ee-b12c-e68bcc6303c8" (UID: "7a616d63-75d6-49ee-b12c-e68bcc6303c8"). InnerVolumeSpecName "kube-api-access-v2rbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.632582 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-kube-api-access-qsc4k" (OuterVolumeSpecName: "kube-api-access-qsc4k") pod "3c7fc068-1b4a-4181-9cd5-cc9eb17d691b" (UID: "3c7fc068-1b4a-4181-9cd5-cc9eb17d691b"). InnerVolumeSpecName "kube-api-access-qsc4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.633460 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c999e8-0459-4d7a-8369-a44ec4af0bde-kube-api-access-lt2z9" (OuterVolumeSpecName: "kube-api-access-lt2z9") pod "c8c999e8-0459-4d7a-8369-a44ec4af0bde" (UID: "c8c999e8-0459-4d7a-8369-a44ec4af0bde"). InnerVolumeSpecName "kube-api-access-lt2z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.633548 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-kube-api-access-dzz4w" (OuterVolumeSpecName: "kube-api-access-dzz4w") pod "56f8f9ef-16ac-496b-a070-f82d7a55e5f8" (UID: "56f8f9ef-16ac-496b-a070-f82d7a55e5f8"). InnerVolumeSpecName "kube-api-access-dzz4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729375 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-operator-scripts\") pod \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\" (UID: \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729434 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wvdl\" (UniqueName: \"kubernetes.io/projected/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-kube-api-access-4wvdl\") pod \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\" (UID: \"80ed3d10-eba4-40ba-b635-7f43e2cc68d9\") " Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729856 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80ed3d10-eba4-40ba-b635-7f43e2cc68d9" (UID: "80ed3d10-eba4-40ba-b635-7f43e2cc68d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729881 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2vck\" (UniqueName: \"kubernetes.io/projected/37a5619a-4b07-4071-9e82-35d8f8a32f19-kube-api-access-p2vck\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729895 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729904 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsc4k\" (UniqueName: \"kubernetes.io/projected/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-kube-api-access-qsc4k\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729913 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2rbw\" (UniqueName: \"kubernetes.io/projected/7a616d63-75d6-49ee-b12c-e68bcc6303c8-kube-api-access-v2rbw\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729921 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37a5619a-4b07-4071-9e82-35d8f8a32f19-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729929 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729937 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8c999e8-0459-4d7a-8369-a44ec4af0bde-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729946 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzz4w\" (UniqueName: \"kubernetes.io/projected/56f8f9ef-16ac-496b-a070-f82d7a55e5f8-kube-api-access-dzz4w\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729958 4690 reconciler_common.go:293] "Volume 
detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a616d63-75d6-49ee-b12c-e68bcc6303c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.729967 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt2z9\" (UniqueName: \"kubernetes.io/projected/c8c999e8-0459-4d7a-8369-a44ec4af0bde-kube-api-access-lt2z9\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.733680 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-kube-api-access-4wvdl" (OuterVolumeSpecName: "kube-api-access-4wvdl") pod "80ed3d10-eba4-40ba-b635-7f43e2cc68d9" (UID: "80ed3d10-eba4-40ba-b635-7f43e2cc68d9"). InnerVolumeSpecName "kube-api-access-4wvdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.831533 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:44 crc kubenswrapper[4690]: I0320 17:52:44.831707 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wvdl\" (UniqueName: \"kubernetes.io/projected/80ed3d10-eba4-40ba-b635-7f43e2cc68d9-kube-api-access-4wvdl\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:45 crc kubenswrapper[4690]: I0320 17:52:45.593507 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7rf8f" event={"ID":"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c","Type":"ContainerStarted","Data":"cb06015b063ddbeb5a91ce5edc99587e449ca03867ccb0724afd6ffc9abc6d78"} Mar 20 17:52:45 crc kubenswrapper[4690]: I0320 17:52:45.665055 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7rf8f" podStartSLOduration=2.325577278 podStartE2EDuration="6.665009295s" podCreationTimestamp="2026-03-20 17:52:39 +0000 UTC" firstStartedPulling="2026-03-20 17:52:39.979448309 +0000 UTC m=+1234.845273987" lastFinishedPulling="2026-03-20 17:52:44.318880316 +0000 UTC m=+1239.184706004" observedRunningTime="2026-03-20 17:52:45.637846158 +0000 UTC m=+1240.503671836" watchObservedRunningTime="2026-03-20 17:52:45.665009295 +0000 UTC m=+1240.530834993" Mar 20 17:52:47 crc kubenswrapper[4690]: I0320 17:52:47.626917 4690 generic.go:334] "Generic (PLEG): container finished" podID="ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c" containerID="cb06015b063ddbeb5a91ce5edc99587e449ca03867ccb0724afd6ffc9abc6d78" exitCode=0 Mar 20 17:52:47 crc kubenswrapper[4690]: I0320 17:52:47.627046 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7rf8f" event={"ID":"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c","Type":"ContainerDied","Data":"cb06015b063ddbeb5a91ce5edc99587e449ca03867ccb0724afd6ffc9abc6d78"} Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.042552 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.207708 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tlwb\" (UniqueName: \"kubernetes.io/projected/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-kube-api-access-2tlwb\") pod \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.207770 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-config-data\") pod \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.207817 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-combined-ca-bundle\") pod \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\" (UID: \"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c\") " Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.217529 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-kube-api-access-2tlwb" (OuterVolumeSpecName: "kube-api-access-2tlwb") pod "ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c" (UID: "ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c"). InnerVolumeSpecName "kube-api-access-2tlwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.256360 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c" (UID: "ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.282893 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-config-data" (OuterVolumeSpecName: "config-data") pod "ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c" (UID: "ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.310612 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tlwb\" (UniqueName: \"kubernetes.io/projected/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-kube-api-access-2tlwb\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.310681 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.310707 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.650317 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7rf8f" event={"ID":"ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c","Type":"ContainerDied","Data":"f35bfe4e8e1eccd35299ccad1654ee45e4f46d5e1a8d660ea604173372f51746"} Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.650638 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f35bfe4e8e1eccd35299ccad1654ee45e4f46d5e1a8d660ea604173372f51746" Mar 20 17:52:49 crc kubenswrapper[4690]: I0320 17:52:49.650391 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7rf8f" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.249245 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7qg9w"] Mar 20 17:52:50 crc kubenswrapper[4690]: E0320 17:52:50.249677 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a616d63-75d6-49ee-b12c-e68bcc6303c8" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.249706 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a616d63-75d6-49ee-b12c-e68bcc6303c8" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: E0320 17:52:50.249722 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ed3d10-eba4-40ba-b635-7f43e2cc68d9" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.249731 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ed3d10-eba4-40ba-b635-7f43e2cc68d9" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: E0320 17:52:50.249758 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c999e8-0459-4d7a-8369-a44ec4af0bde" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.249768 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c999e8-0459-4d7a-8369-a44ec4af0bde" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: E0320 17:52:50.249783 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f8f9ef-16ac-496b-a070-f82d7a55e5f8" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.249790 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f8f9ef-16ac-496b-a070-f82d7a55e5f8" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: E0320 17:52:50.249803 4690 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3c7fc068-1b4a-4181-9cd5-cc9eb17d691b" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.249810 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c7fc068-1b4a-4181-9cd5-cc9eb17d691b" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: E0320 17:52:50.249834 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c" containerName="keystone-db-sync" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.249842 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c" containerName="keystone-db-sync" Mar 20 17:52:50 crc kubenswrapper[4690]: E0320 17:52:50.249861 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a5619a-4b07-4071-9e82-35d8f8a32f19" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.249868 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a5619a-4b07-4071-9e82-35d8f8a32f19" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.250060 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ed3d10-eba4-40ba-b635-7f43e2cc68d9" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.250079 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c999e8-0459-4d7a-8369-a44ec4af0bde" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.250092 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a616d63-75d6-49ee-b12c-e68bcc6303c8" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.250102 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c" containerName="keystone-db-sync" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.250116 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c7fc068-1b4a-4181-9cd5-cc9eb17d691b" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.250131 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a5619a-4b07-4071-9e82-35d8f8a32f19" containerName="mariadb-database-create" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.250142 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f8f9ef-16ac-496b-a070-f82d7a55e5f8" containerName="mariadb-account-create-update" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.251178 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.269427 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7qg9w"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.303146 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-f4szb"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.304503 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.308025 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.308059 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7l4fq" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.308300 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.308434 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.308481 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.330575 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f4szb"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433231 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-combined-ca-bundle\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433334 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433424 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-fernet-keys\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433466 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433493 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-scripts\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433528 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96fm2\" (UniqueName: \"kubernetes.io/projected/f5464bae-77a9-4be2-a0ef-56149b4c53c6-kube-api-access-96fm2\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433560 4690 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-config-data\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433580 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-credential-keys\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433601 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldlp\" (UniqueName: \"kubernetes.io/projected/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-kube-api-access-6ldlp\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433660 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433695 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-config\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.433717 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-svc\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.495876 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wqk6t"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.496862 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.501639 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.501894 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.501913 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-phs8q" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535466 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldlp\" (UniqueName: \"kubernetes.io/projected/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-kube-api-access-6ldlp\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535564 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535599 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-config\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535623 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-svc\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535644 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-combined-ca-bundle\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535685 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535722 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-fernet-keys\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535761 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: 
\"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535783 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-scripts\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535821 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96fm2\" (UniqueName: \"kubernetes.io/projected/f5464bae-77a9-4be2-a0ef-56149b4c53c6-kube-api-access-96fm2\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535854 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-config-data\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.535871 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-credential-keys\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.537441 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.538091 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-svc\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.538410 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-config\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.538853 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.539192 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 
17:52:50.543405 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-credential-keys\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.545628 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-combined-ca-bundle\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.549945 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-config-data\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.554725 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-scripts\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.560331 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-fernet-keys\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.565019 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96fm2\" (UniqueName: \"kubernetes.io/projected/f5464bae-77a9-4be2-a0ef-56149b4c53c6-kube-api-access-96fm2\") pod \"dnsmasq-dns-847c4cc679-7qg9w\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.582717 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldlp\" (UniqueName: \"kubernetes.io/projected/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-kube-api-access-6ldlp\") pod \"keystone-bootstrap-f4szb\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.611621 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.624585 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f5b984557-pdc26"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.626242 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.639110 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-db-sync-config-data\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.639201 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-combined-ca-bundle\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.639260 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-scripts\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.639296 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gqwx\" (UniqueName: \"kubernetes.io/projected/ef3bcd50-5724-42a1-92df-262256c07d45-kube-api-access-8gqwx\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.639318 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef3bcd50-5724-42a1-92df-262256c07d45-etc-machine-id\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.639340 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-config-data\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.644341 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wqk6t"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.647763 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-blcw6" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.647966 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.649490 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.653605 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.662130 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.698322 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-t2qth"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.699377 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.706472 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.707037 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xwbns" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.707226 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.739330 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f5b984557-pdc26"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740420 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-scripts\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740469 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gqwx\" (UniqueName: \"kubernetes.io/projected/ef3bcd50-5724-42a1-92df-262256c07d45-kube-api-access-8gqwx\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740496 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/94582dde-cc89-4d46-8a8d-655a743e8c02-horizon-secret-key\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740520 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef3bcd50-5724-42a1-92df-262256c07d45-etc-machine-id\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740544 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-config-data\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740578 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-db-sync-config-data\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740608 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-scripts\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740637 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-config-data\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740664 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-combined-ca-bundle\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740704 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94582dde-cc89-4d46-8a8d-655a743e8c02-logs\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.740725 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhnw\" (UniqueName: \"kubernetes.io/projected/94582dde-cc89-4d46-8a8d-655a743e8c02-kube-api-access-kwhnw\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.744218 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef3bcd50-5724-42a1-92df-262256c07d45-etc-machine-id\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.746635 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-scripts\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.747427 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-db-sync-config-data\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.749288 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-config-data\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.749831 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-combined-ca-bundle\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " 
pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.764137 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-t2qth"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.796859 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gqwx\" (UniqueName: \"kubernetes.io/projected/ef3bcd50-5724-42a1-92df-262256c07d45-kube-api-access-8gqwx\") pod \"cinder-db-sync-wqk6t\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.809128 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.823261 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-m4wn2"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.824351 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.831747 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.831914 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hbpx6" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.841760 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-scripts\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.841814 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-config-data\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.841864 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94582dde-cc89-4d46-8a8d-655a743e8c02-logs\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.841885 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhnw\" (UniqueName: \"kubernetes.io/projected/94582dde-cc89-4d46-8a8d-655a743e8c02-kube-api-access-kwhnw\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.841914 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-config\") pod \"neutron-db-sync-t2qth\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.841935 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-combined-ca-bundle\") pod \"neutron-db-sync-t2qth\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.841955 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/94582dde-cc89-4d46-8a8d-655a743e8c02-horizon-secret-key\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.841973 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2j52\" (UniqueName: \"kubernetes.io/projected/3770976f-1610-4bb2-97db-0d81d8af8de1-kube-api-access-s2j52\") pod \"neutron-db-sync-t2qth\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.842601 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-scripts\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.843401 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-config-data\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.843593 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94582dde-cc89-4d46-8a8d-655a743e8c02-logs\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.856547 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/94582dde-cc89-4d46-8a8d-655a743e8c02-horizon-secret-key\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.866214 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-m4wn2"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.871806 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhnw\" (UniqueName: \"kubernetes.io/projected/94582dde-cc89-4d46-8a8d-655a743e8c02-kube-api-access-kwhnw\") pod \"horizon-5f5b984557-pdc26\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.883491 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7qg9w"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.917097 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dzgr7"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.918139 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.923170 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.923363 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tq8mp" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.923457 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.931304 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d974c8585-p46g9"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.932489 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.943316 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-config\") pod \"neutron-db-sync-t2qth\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.943357 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-combined-ca-bundle\") pod \"neutron-db-sync-t2qth\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.943392 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2j52\" (UniqueName: \"kubernetes.io/projected/3770976f-1610-4bb2-97db-0d81d8af8de1-kube-api-access-s2j52\") pod \"neutron-db-sync-t2qth\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.943432 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ck7x\" (UniqueName: \"kubernetes.io/projected/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-kube-api-access-6ck7x\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.943453 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-combined-ca-bundle\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.943493 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-db-sync-config-data\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.946876 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-combined-ca-bundle\") pod \"neutron-db-sync-t2qth\" (UID: 
\"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.951820 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-config\") pod \"neutron-db-sync-t2qth\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.955768 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.956974 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.959534 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.959841 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.959945 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-x6sz7" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.960098 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:52:50 crc kubenswrapper[4690]: I0320 17:52:50.997195 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pn26k"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.002801 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.043700 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2j52\" (UniqueName: \"kubernetes.io/projected/3770976f-1610-4bb2-97db-0d81d8af8de1-kube-api-access-s2j52\") pod \"neutron-db-sync-t2qth\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.043924 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.044841 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bls69\" (UniqueName: \"kubernetes.io/projected/b5f727fa-133b-4692-b725-e6854dc359fd-kube-api-access-bls69\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.044895 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-config-data\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.044936 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-scripts\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 
crc kubenswrapper[4690]: I0320 17:52:51.044966 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.044982 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-config-data\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045001 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ck7x\" (UniqueName: \"kubernetes.io/projected/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-kube-api-access-6ck7x\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045021 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-combined-ca-bundle\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045042 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e58293ba-94a0-47f1-977a-2a10fd0f5845-logs\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045059 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-combined-ca-bundle\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045074 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a50078-3ca8-4c47-8067-7473a9376323-logs\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045088 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045102 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e58293ba-94a0-47f1-977a-2a10fd0f5845-horizon-secret-key\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: 
I0320 17:52:51.045118 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-scripts\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045131 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045153 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-db-sync-config-data\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045169 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045199 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkm2\" (UniqueName: \"kubernetes.io/projected/e58293ba-94a0-47f1-977a-2a10fd0f5845-kube-api-access-6hkm2\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045223 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045248 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hdw2\" (UniqueName: \"kubernetes.io/projected/14a50078-3ca8-4c47-8067-7473a9376323-kube-api-access-8hdw2\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045282 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-logs\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045296 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 
17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.045917 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.049081 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.050234 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-combined-ca-bundle\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.051793 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.056574 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-db-sync-config-data\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.065868 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dzgr7"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.069688 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ck7x\" (UniqueName: \"kubernetes.io/projected/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-kube-api-access-6ck7x\") pod \"barbican-db-sync-m4wn2\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.076933 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d974c8585-p46g9"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.086531 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.090605 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.102461 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pn26k"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.116052 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-t2qth" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.123081 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.146775 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-scripts\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.146981 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bls69\" (UniqueName: \"kubernetes.io/projected/b5f727fa-133b-4692-b725-e6854dc359fd-kube-api-access-bls69\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.147147 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.147222 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-config-data\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.147310 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-scripts\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.147954 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-run-httpd\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148024 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148099 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-config-data\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148175 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-config\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: 
\"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148250 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148388 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e58293ba-94a0-47f1-977a-2a10fd0f5845-logs\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148458 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148531 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzlvs\" (UniqueName: \"kubernetes.io/projected/c491527c-0ddb-41bc-86f2-78334b0b3075-kube-api-access-tzlvs\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148596 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-combined-ca-bundle\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148657 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a50078-3ca8-4c47-8067-7473a9376323-logs\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148715 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148772 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e58293ba-94a0-47f1-977a-2a10fd0f5845-horizon-secret-key\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148842 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njdkp\" (UniqueName: \"kubernetes.io/projected/d3959350-36d3-4ea7-92af-94ac690b406e-kube-api-access-njdkp\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " 
pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148905 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-scripts\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.148964 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149030 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-log-httpd\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149090 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149163 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkm2\" (UniqueName: \"kubernetes.io/projected/e58293ba-94a0-47f1-977a-2a10fd0f5845-kube-api-access-6hkm2\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149232 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149326 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149418 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hdw2\" (UniqueName: \"kubernetes.io/projected/14a50078-3ca8-4c47-8067-7473a9376323-kube-api-access-8hdw2\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149476 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-config-data\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149552 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-logs\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149620 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149688 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.149754 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.151118 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-config-data\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.153138 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-scripts\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.156578 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.158695 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.159541 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.163213 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e58293ba-94a0-47f1-977a-2a10fd0f5845-logs\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.160919 
4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-logs\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.164436 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a50078-3ca8-4c47-8067-7473a9376323-logs\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.166432 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.166723 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.167367 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bls69\" (UniqueName: \"kubernetes.io/projected/b5f727fa-133b-4692-b725-e6854dc359fd-kube-api-access-bls69\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.169621 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.169807 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.171648 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-config-data\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.171827 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.174717 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e58293ba-94a0-47f1-977a-2a10fd0f5845-horizon-secret-key\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.176214 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-scripts\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.177253 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkm2\" (UniqueName: \"kubernetes.io/projected/e58293ba-94a0-47f1-977a-2a10fd0f5845-kube-api-access-6hkm2\") pod \"horizon-d974c8585-p46g9\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.177790 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hdw2\" (UniqueName: \"kubernetes.io/projected/14a50078-3ca8-4c47-8067-7473a9376323-kube-api-access-8hdw2\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.179399 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.186365 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-combined-ca-bundle\") pod \"placement-db-sync-dzgr7\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.188992 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.192432 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.193356 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.195059 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: 
\"b5f727fa-133b-4692-b725-e6854dc359fd\") " pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250753 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250789 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzlvs\" (UniqueName: \"kubernetes.io/projected/c491527c-0ddb-41bc-86f2-78334b0b3075-kube-api-access-tzlvs\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250810 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njdkp\" (UniqueName: \"kubernetes.io/projected/d3959350-36d3-4ea7-92af-94ac690b406e-kube-api-access-njdkp\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250841 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-log-httpd\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250860 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250894 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250911 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250928 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-config-data\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250942 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 
17:52:51.250970 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.250985 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251006 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251028 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251053 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-scripts\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251083 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251102 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz7h2\" (UniqueName: \"kubernetes.io/projected/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-kube-api-access-gz7h2\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251123 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251163 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-run-httpd\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251186 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-config\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251211 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251230 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.251989 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.252586 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.253164 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.253501 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-run-httpd\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.253774 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-config\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.254047 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-log-httpd\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.254575 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzgr7" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.254624 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.254919 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-config-data\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.256638 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.260352 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-scripts\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.261017 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.283426 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzlvs\" (UniqueName: \"kubernetes.io/projected/c491527c-0ddb-41bc-86f2-78334b0b3075-kube-api-access-tzlvs\") pod \"dnsmasq-dns-785d8bcb8c-pn26k\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.294364 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njdkp\" (UniqueName: \"kubernetes.io/projected/d3959350-36d3-4ea7-92af-94ac690b406e-kube-api-access-njdkp\") pod \"ceilometer-0\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.297814 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.312075 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.347983 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.352579 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.352628 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.352654 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.352800 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.353410 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.353836 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.353952 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.354017 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz7h2\" (UniqueName: \"kubernetes.io/projected/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-kube-api-access-gz7h2\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.354123 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 
17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.354417 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.354980 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.356243 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.357594 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.358102 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.358493 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.369370 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz7h2\" (UniqueName: \"kubernetes.io/projected/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-kube-api-access-gz7h2\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.379899 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.396036 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.440062 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7qg9w"] Mar 20 17:52:51 crc kubenswrapper[4690]: W0320 17:52:51.480875 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5464bae_77a9_4be2_a0ef_56149b4c53c6.slice/crio-8c650896fc60faa06a3413134a0f3abd38faaa7e2e114943bf481cdcc4499cb1 WatchSource:0}: Error finding container 8c650896fc60faa06a3413134a0f3abd38faaa7e2e114943bf481cdcc4499cb1: Status 404 returned error can't find the container with id 8c650896fc60faa06a3413134a0f3abd38faaa7e2e114943bf481cdcc4499cb1 Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.495815 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.557683 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f4szb"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.573494 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wqk6t"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.725473 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wqk6t" event={"ID":"ef3bcd50-5724-42a1-92df-262256c07d45","Type":"ContainerStarted","Data":"ce7b0954fa2bd6328ca83735ac00e5f329ae92e775b41ebd79c2c75e40d6a856"} Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.729930 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f4szb" event={"ID":"b572482e-a7ea-4311-bff2-2cbd1ec4b42e","Type":"ContainerStarted","Data":"aaca3b1392dec6d8d06c58001965720304aa77e766b602620c81fb7a7ac14aee"} Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.731524 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" event={"ID":"f5464bae-77a9-4be2-a0ef-56149b4c53c6","Type":"ContainerStarted","Data":"8c650896fc60faa06a3413134a0f3abd38faaa7e2e114943bf481cdcc4499cb1"} Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.762963 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-m4wn2"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.780455 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f5b984557-pdc26"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.787888 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-t2qth"] Mar 20 17:52:51 crc kubenswrapper[4690]: I0320 17:52:51.942547 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dzgr7"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.119257 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d974c8585-p46g9"] Mar 20 17:52:52 crc kubenswrapper[4690]: W0320 17:52:52.204642 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3959350_36d3_4ea7_92af_94ac690b406e.slice/crio-70b3cc244b65edbea95e6721df694d7d742b7c665bcaf84d71699df2e10bfb9e WatchSource:0}: Error finding container 70b3cc244b65edbea95e6721df694d7d742b7c665bcaf84d71699df2e10bfb9e: Status 404 returned error can't find the container with id 70b3cc244b65edbea95e6721df694d7d742b7c665bcaf84d71699df2e10bfb9e Mar 20 17:52:52 crc 
kubenswrapper[4690]: W0320 17:52:52.207029 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f727fa_133b_4692_b725_e6854dc359fd.slice/crio-784d2930b4839444625d546a6070040db4072e81f61faa429f66b9eb420e7913 WatchSource:0}: Error finding container 784d2930b4839444625d546a6070040db4072e81f61faa429f66b9eb420e7913: Status 404 returned error can't find the container with id 784d2930b4839444625d546a6070040db4072e81f61faa429f66b9eb420e7913 Mar 20 17:52:52 crc kubenswrapper[4690]: W0320 17:52:52.210986 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc491527c_0ddb_41bc_86f2_78334b0b3075.slice/crio-e8a264045acf53343689a596ede5744acdfe25296397850af8ec3e9effe60730 WatchSource:0}: Error finding container e8a264045acf53343689a596ede5744acdfe25296397850af8ec3e9effe60730: Status 404 returned error can't find the container with id e8a264045acf53343689a596ede5744acdfe25296397850af8ec3e9effe60730 Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.215784 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.226842 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.241794 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pn26k"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.360704 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.765008 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.775742 4690 generic.go:334] "Generic (PLEG): container finished" podID="c491527c-0ddb-41bc-86f2-78334b0b3075" containerID="75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7" exitCode=0 Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.775800 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" event={"ID":"c491527c-0ddb-41bc-86f2-78334b0b3075","Type":"ContainerDied","Data":"75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.775825 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" event={"ID":"c491527c-0ddb-41bc-86f2-78334b0b3075","Type":"ContainerStarted","Data":"e8a264045acf53343689a596ede5744acdfe25296397850af8ec3e9effe60730"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.802103 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f5b984557-pdc26" event={"ID":"94582dde-cc89-4d46-8a8d-655a743e8c02","Type":"ContainerStarted","Data":"9eb81841b72316a1512f4e8eaeb5b51b5e5e5a32aea8a5a6f3968aefa0982f0a"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.817386 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f5b984557-pdc26"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.835600 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d974c8585-p46g9" event={"ID":"e58293ba-94a0-47f1-977a-2a10fd0f5845","Type":"ContainerStarted","Data":"981c62c8508c07c6d0631b84e33152a25df3591d9369b345a2375943883d4d7b"} Mar 20 17:52:52 crc 
kubenswrapper[4690]: I0320 17:52:52.863234 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dd9c5c889-p2cbw"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.864939 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.867020 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m4wn2" event={"ID":"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0","Type":"ContainerStarted","Data":"269ee3c16fb9ae06427e37a08ebc86df57d7a3c610371a6b5fa5e41aa7314240"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.873967 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dd9c5c889-p2cbw"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.874994 4690 generic.go:334] "Generic (PLEG): container finished" podID="f5464bae-77a9-4be2-a0ef-56149b4c53c6" containerID="766cf21eb2fb9de725dd5a9fd199032b396f6fbaef6e9306cf251e3af3441e73" exitCode=0 Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.875201 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" event={"ID":"f5464bae-77a9-4be2-a0ef-56149b4c53c6","Type":"ContainerDied","Data":"766cf21eb2fb9de725dd5a9fd199032b396f6fbaef6e9306cf251e3af3441e73"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.894825 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f4szb" event={"ID":"b572482e-a7ea-4311-bff2-2cbd1ec4b42e","Type":"ContainerStarted","Data":"18b20fa2f931ba634ef35191294085cb0f1172e33f8bc6cc2c9c6269344e60d8"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.898039 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.903322 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.913366 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0","Type":"ContainerStarted","Data":"e4ca2ac6dfdf22b2164911289da3eeaa53890b1e09dd24c0c0c673f75654dfdf"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.914849 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzgr7" event={"ID":"14a50078-3ca8-4c47-8067-7473a9376323","Type":"ContainerStarted","Data":"4240b34311429dae5c14a389c14abcb9f8ebb9c87a73ca9455634d57ad2b7f52"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.924799 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t2qth" event={"ID":"3770976f-1610-4bb2-97db-0d81d8af8de1","Type":"ContainerStarted","Data":"cba91b6ab23732ed261f40e322910c9f1b17b102b8693b9ec31cdbe5057efa66"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.924847 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t2qth" event={"ID":"3770976f-1610-4bb2-97db-0d81d8af8de1","Type":"ContainerStarted","Data":"06fcc9c3c97ee1cfffdb9190a5a2c534c94e9d009361140cf6b83d9da2eb4294"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.937385 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerStarted","Data":"70b3cc244b65edbea95e6721df694d7d742b7c665bcaf84d71699df2e10bfb9e"} Mar 20 17:52:52 crc 
kubenswrapper[4690]: I0320 17:52:52.952239 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-f4szb" podStartSLOduration=2.952218705 podStartE2EDuration="2.952218705s" podCreationTimestamp="2026-03-20 17:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:52.947384386 +0000 UTC m=+1247.813210084" watchObservedRunningTime="2026-03-20 17:52:52.952218705 +0000 UTC m=+1247.818044383" Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.955183 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5f727fa-133b-4692-b725-e6854dc359fd","Type":"ContainerStarted","Data":"784d2930b4839444625d546a6070040db4072e81f61faa429f66b9eb420e7913"} Mar 20 17:52:52 crc kubenswrapper[4690]: I0320 17:52:52.974823 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-t2qth" podStartSLOduration=2.974807191 podStartE2EDuration="2.974807191s" podCreationTimestamp="2026-03-20 17:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:52.969680625 +0000 UTC m=+1247.835506303" watchObservedRunningTime="2026-03-20 17:52:52.974807191 +0000 UTC m=+1247.840632869" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.002346 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a888cef-7b34-4080-9689-79877684be58-horizon-secret-key\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.002391 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-scripts\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.002454 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-config-data\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.002507 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a888cef-7b34-4080-9689-79877684be58-logs\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.002526 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6knm\" (UniqueName: \"kubernetes.io/projected/2a888cef-7b34-4080-9689-79877684be58-kube-api-access-d6knm\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.104015 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a888cef-7b34-4080-9689-79877684be58-horizon-secret-key\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.104057 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-scripts\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.104128 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-config-data\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.104153 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a888cef-7b34-4080-9689-79877684be58-logs\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.104170 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6knm\" (UniqueName: \"kubernetes.io/projected/2a888cef-7b34-4080-9689-79877684be58-kube-api-access-d6knm\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.105375 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-scripts\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.105946 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a888cef-7b34-4080-9689-79877684be58-logs\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.106567 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-config-data\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.110470 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a888cef-7b34-4080-9689-79877684be58-horizon-secret-key\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.125379 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6knm\" (UniqueName: \"kubernetes.io/projected/2a888cef-7b34-4080-9689-79877684be58-kube-api-access-d6knm\") pod \"horizon-dd9c5c889-p2cbw\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " 
pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.269790 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.282890 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.410937 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-sb\") pod \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.410986 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96fm2\" (UniqueName: \"kubernetes.io/projected/f5464bae-77a9-4be2-a0ef-56149b4c53c6-kube-api-access-96fm2\") pod \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.411046 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-svc\") pod \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.411062 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-nb\") pod \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.411103 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-config\") pod \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.411163 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-swift-storage-0\") pod \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\" (UID: \"f5464bae-77a9-4be2-a0ef-56149b4c53c6\") " Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.416650 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5464bae-77a9-4be2-a0ef-56149b4c53c6-kube-api-access-96fm2" (OuterVolumeSpecName: "kube-api-access-96fm2") pod "f5464bae-77a9-4be2-a0ef-56149b4c53c6" (UID: "f5464bae-77a9-4be2-a0ef-56149b4c53c6"). InnerVolumeSpecName "kube-api-access-96fm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.456449 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5464bae-77a9-4be2-a0ef-56149b4c53c6" (UID: "f5464bae-77a9-4be2-a0ef-56149b4c53c6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.473087 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-config" (OuterVolumeSpecName: "config") pod "f5464bae-77a9-4be2-a0ef-56149b4c53c6" (UID: "f5464bae-77a9-4be2-a0ef-56149b4c53c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.477024 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5464bae-77a9-4be2-a0ef-56149b4c53c6" (UID: "f5464bae-77a9-4be2-a0ef-56149b4c53c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.482250 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5464bae-77a9-4be2-a0ef-56149b4c53c6" (UID: "f5464bae-77a9-4be2-a0ef-56149b4c53c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.500004 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5464bae-77a9-4be2-a0ef-56149b4c53c6" (UID: "f5464bae-77a9-4be2-a0ef-56149b4c53c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.519149 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96fm2\" (UniqueName: \"kubernetes.io/projected/f5464bae-77a9-4be2-a0ef-56149b4c53c6-kube-api-access-96fm2\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.519236 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.519335 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.519388 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.519439 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.519492 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5464bae-77a9-4be2-a0ef-56149b4c53c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.824024 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dd9c5c889-p2cbw"] 
Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.976893 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0","Type":"ContainerStarted","Data":"fc1f322af9e991ae5b698c4deba85094dbcc4f9719aba26a22ba3279040910bb"} Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.981950 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" event={"ID":"f5464bae-77a9-4be2-a0ef-56149b4c53c6","Type":"ContainerDied","Data":"8c650896fc60faa06a3413134a0f3abd38faaa7e2e114943bf481cdcc4499cb1"} Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.981997 4690 scope.go:117] "RemoveContainer" containerID="766cf21eb2fb9de725dd5a9fd199032b396f6fbaef6e9306cf251e3af3441e73" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.982104 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-7qg9w" Mar 20 17:52:53 crc kubenswrapper[4690]: I0320 17:52:53.984999 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5f727fa-133b-4692-b725-e6854dc359fd","Type":"ContainerStarted","Data":"aad04c4f37bcade50528ca6e45886e535e75346d3f121c5a109505ce3274d6f1"} Mar 20 17:52:54 crc kubenswrapper[4690]: I0320 17:52:54.000957 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" event={"ID":"c491527c-0ddb-41bc-86f2-78334b0b3075","Type":"ContainerStarted","Data":"9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8"} Mar 20 17:52:54 crc kubenswrapper[4690]: I0320 17:52:54.001682 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:52:54 crc kubenswrapper[4690]: I0320 17:52:54.068187 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7qg9w"] Mar 20 17:52:54 crc kubenswrapper[4690]: I0320 17:52:54.068237 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dd9c5c889-p2cbw" event={"ID":"2a888cef-7b34-4080-9689-79877684be58","Type":"ContainerStarted","Data":"94e1a80a70166a34a353cd235f9fca27370e866e7d524cc08636775cd6dc2dec"} Mar 20 17:52:54 crc kubenswrapper[4690]: I0320 17:52:54.094768 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7qg9w"] Mar 20 17:52:54 crc kubenswrapper[4690]: I0320 17:52:54.113501 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" podStartSLOduration=4.11348322 podStartE2EDuration="4.11348322s" podCreationTimestamp="2026-03-20 17:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:54.068137372 +0000 UTC m=+1248.933963050" watchObservedRunningTime="2026-03-20 17:52:54.11348322 +0000 UTC m=+1248.979308898" Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.084828 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5f727fa-133b-4692-b725-e6854dc359fd","Type":"ContainerStarted","Data":"05b2f269e92ffd8edc6ad6ac18dd0cd4e2903f7a1e3ce02e0534162a769823a8"} Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.085328 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="b5f727fa-133b-4692-b725-e6854dc359fd" containerName="glance-log" containerID="cri-o://aad04c4f37bcade50528ca6e45886e535e75346d3f121c5a109505ce3274d6f1" gracePeriod=30 Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.085593 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b5f727fa-133b-4692-b725-e6854dc359fd" containerName="glance-httpd" containerID="cri-o://05b2f269e92ffd8edc6ad6ac18dd0cd4e2903f7a1e3ce02e0534162a769823a8" gracePeriod=30 Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.092017 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0","Type":"ContainerStarted","Data":"79dab24f4bf0296932cee4a6d89d82e399b89305d1e558f9a96ed466729c8f4c"} Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.092181 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerName="glance-log" containerID="cri-o://fc1f322af9e991ae5b698c4deba85094dbcc4f9719aba26a22ba3279040910bb" gracePeriod=30 Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.092306 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerName="glance-httpd" containerID="cri-o://79dab24f4bf0296932cee4a6d89d82e399b89305d1e558f9a96ed466729c8f4c" gracePeriod=30 Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.111492 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.111472653 podStartE2EDuration="5.111472653s" podCreationTimestamp="2026-03-20 17:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:55.106076938 +0000 UTC m=+1249.971902616" watchObservedRunningTime="2026-03-20 17:52:55.111472653 +0000 UTC m=+1249.977298331" Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.130019 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.130001523 podStartE2EDuration="4.130001523s" podCreationTimestamp="2026-03-20 17:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:52:55.127285625 +0000 UTC m=+1249.993111303" watchObservedRunningTime="2026-03-20 17:52:55.130001523 +0000 UTC m=+1249.995827201" Mar 20 17:52:55 crc kubenswrapper[4690]: I0320 17:52:55.896487 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5464bae-77a9-4be2-a0ef-56149b4c53c6" path="/var/lib/kubelet/pods/f5464bae-77a9-4be2-a0ef-56149b4c53c6/volumes" Mar 20 17:52:56 crc kubenswrapper[4690]: I0320 17:52:56.114426 4690 generic.go:334] "Generic (PLEG): container finished" podID="b5f727fa-133b-4692-b725-e6854dc359fd" containerID="05b2f269e92ffd8edc6ad6ac18dd0cd4e2903f7a1e3ce02e0534162a769823a8" exitCode=0 Mar 20 17:52:56 crc kubenswrapper[4690]: I0320 17:52:56.114465 4690 generic.go:334] "Generic (PLEG): container finished" podID="b5f727fa-133b-4692-b725-e6854dc359fd" containerID="aad04c4f37bcade50528ca6e45886e535e75346d3f121c5a109505ce3274d6f1" exitCode=143 Mar 20 17:52:56 crc kubenswrapper[4690]: I0320 17:52:56.114512 4690 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5f727fa-133b-4692-b725-e6854dc359fd","Type":"ContainerDied","Data":"05b2f269e92ffd8edc6ad6ac18dd0cd4e2903f7a1e3ce02e0534162a769823a8"} Mar 20 17:52:56 crc kubenswrapper[4690]: I0320 17:52:56.114557 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5f727fa-133b-4692-b725-e6854dc359fd","Type":"ContainerDied","Data":"aad04c4f37bcade50528ca6e45886e535e75346d3f121c5a109505ce3274d6f1"} Mar 20 17:52:56 crc kubenswrapper[4690]: I0320 17:52:56.118423 4690 generic.go:334] "Generic (PLEG): container finished" podID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerID="79dab24f4bf0296932cee4a6d89d82e399b89305d1e558f9a96ed466729c8f4c" exitCode=0 Mar 20 17:52:56 crc kubenswrapper[4690]: I0320 17:52:56.118453 4690 generic.go:334] "Generic (PLEG): container finished" podID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerID="fc1f322af9e991ae5b698c4deba85094dbcc4f9719aba26a22ba3279040910bb" exitCode=143 Mar 20 17:52:56 crc kubenswrapper[4690]: I0320 17:52:56.118629 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0","Type":"ContainerDied","Data":"79dab24f4bf0296932cee4a6d89d82e399b89305d1e558f9a96ed466729c8f4c"} Mar 20 17:52:56 crc kubenswrapper[4690]: I0320 17:52:56.118672 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0","Type":"ContainerDied","Data":"fc1f322af9e991ae5b698c4deba85094dbcc4f9719aba26a22ba3279040910bb"} Mar 20 17:52:57 crc kubenswrapper[4690]: I0320 17:52:57.164164 4690 generic.go:334] "Generic (PLEG): container finished" podID="b572482e-a7ea-4311-bff2-2cbd1ec4b42e" containerID="18b20fa2f931ba634ef35191294085cb0f1172e33f8bc6cc2c9c6269344e60d8" exitCode=0 Mar 20 17:52:57 crc kubenswrapper[4690]: I0320 17:52:57.164553 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f4szb" event={"ID":"b572482e-a7ea-4311-bff2-2cbd1ec4b42e","Type":"ContainerDied","Data":"18b20fa2f931ba634ef35191294085cb0f1172e33f8bc6cc2c9c6269344e60d8"} Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.326111 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d974c8585-p46g9"] Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.366798 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-867c5896-qkwmr"] Mar 20 17:52:59 crc kubenswrapper[4690]: E0320 17:52:59.367138 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5464bae-77a9-4be2-a0ef-56149b4c53c6" containerName="init" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.367158 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5464bae-77a9-4be2-a0ef-56149b4c53c6" containerName="init" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.367354 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5464bae-77a9-4be2-a0ef-56149b4c53c6" containerName="init" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.368153 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.372330 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.388372 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-867c5896-qkwmr"] Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.402715 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/607d61e7-e52a-46e6-a23a-2d4714c5b543-logs\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.402774 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-secret-key\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.402793 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-config-data\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.402808 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-scripts\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.402856 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlrb8\" (UniqueName: \"kubernetes.io/projected/607d61e7-e52a-46e6-a23a-2d4714c5b543-kube-api-access-rlrb8\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.402876 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-combined-ca-bundle\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.402913 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-tls-certs\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.480208 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dd9c5c889-p2cbw"] Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.503735 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-combined-ca-bundle\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.503822 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-tls-certs\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.503900 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/607d61e7-e52a-46e6-a23a-2d4714c5b543-logs\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.503933 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-secret-key\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.503947 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-config-data\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.503963 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-scripts\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.503988 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlrb8\" (UniqueName: \"kubernetes.io/projected/607d61e7-e52a-46e6-a23a-2d4714c5b543-kube-api-access-rlrb8\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.505187 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/607d61e7-e52a-46e6-a23a-2d4714c5b543-logs\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.506283 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-config-data\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.506698 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-scripts\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: 
I0320 17:52:59.510195 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-tls-certs\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.511908 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-combined-ca-bundle\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.515418 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-secret-key\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.551700 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dc95ccffb-gvrdq"] Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.553110 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.569947 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlrb8\" (UniqueName: \"kubernetes.io/projected/607d61e7-e52a-46e6-a23a-2d4714c5b543-kube-api-access-rlrb8\") pod \"horizon-867c5896-qkwmr\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.609870 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6nm\" (UniqueName: \"kubernetes.io/projected/799b195a-e6e5-4a19-b41a-1c7550e21e90-kube-api-access-zz6nm\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.609918 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-horizon-tls-certs\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.609962 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799b195a-e6e5-4a19-b41a-1c7550e21e90-scripts\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.609993 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799b195a-e6e5-4a19-b41a-1c7550e21e90-config-data\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.610007 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-horizon-secret-key\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.610063 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-combined-ca-bundle\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.610098 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799b195a-e6e5-4a19-b41a-1c7550e21e90-logs\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.618631 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dc95ccffb-gvrdq"] Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.687036 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.712136 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799b195a-e6e5-4a19-b41a-1c7550e21e90-scripts\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.712541 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799b195a-e6e5-4a19-b41a-1c7550e21e90-config-data\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.712568 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-horizon-secret-key\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.712661 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-combined-ca-bundle\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.712721 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799b195a-e6e5-4a19-b41a-1c7550e21e90-logs\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.712766 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6nm\" (UniqueName: \"kubernetes.io/projected/799b195a-e6e5-4a19-b41a-1c7550e21e90-kube-api-access-zz6nm\") pod \"horizon-dc95ccffb-gvrdq\" (UID: 
\"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.712789 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-horizon-tls-certs\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.712897 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799b195a-e6e5-4a19-b41a-1c7550e21e90-scripts\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.713301 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799b195a-e6e5-4a19-b41a-1c7550e21e90-logs\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.713622 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799b195a-e6e5-4a19-b41a-1c7550e21e90-config-data\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.733932 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-horizon-secret-key\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.734061 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-combined-ca-bundle\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.738762 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/799b195a-e6e5-4a19-b41a-1c7550e21e90-horizon-tls-certs\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.754939 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6nm\" (UniqueName: \"kubernetes.io/projected/799b195a-e6e5-4a19-b41a-1c7550e21e90-kube-api-access-zz6nm\") pod \"horizon-dc95ccffb-gvrdq\" (UID: \"799b195a-e6e5-4a19-b41a-1c7550e21e90\") " pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:52:59 crc kubenswrapper[4690]: I0320 17:52:59.943741 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:53:01 crc kubenswrapper[4690]: I0320 17:53:01.349486 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:53:01 crc kubenswrapper[4690]: I0320 17:53:01.438038 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-f6h5n"] Mar 20 17:53:01 crc kubenswrapper[4690]: I0320 17:53:01.438317 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="dnsmasq-dns" containerID="cri-o://75ee5ffb621d9785b335466768a7b7cc4b8bfa373b10846427a02d4c1abc31cd" gracePeriod=10 Mar 20 17:53:02 crc kubenswrapper[4690]: I0320 17:53:02.116820 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 20 17:53:03 crc kubenswrapper[4690]: I0320 17:53:03.230137 4690 generic.go:334] "Generic (PLEG): container finished" podID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerID="75ee5ffb621d9785b335466768a7b7cc4b8bfa373b10846427a02d4c1abc31cd" exitCode=0 Mar 20 17:53:03 crc kubenswrapper[4690]: I0320 17:53:03.230305 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" event={"ID":"d5e029b9-bf4d-4700-9a5f-c35bd3459b15","Type":"ContainerDied","Data":"75ee5ffb621d9785b335466768a7b7cc4b8bfa373b10846427a02d4c1abc31cd"} Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.116124 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 20 17:53:07 crc kubenswrapper[4690]: E0320 17:53:07.438496 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 17:53:07 crc kubenswrapper[4690]: E0320 17:53:07.438715 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf6h66h658hc5h675h59dh54h5c7h5b6h66hc9hd4h5cdh587h56bh89hc7h55fhf6h55h56ch5b9h96h89h6bh688h574h5dch86h5b5hffh5c4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwhnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5f5b984557-pdc26_openstack(94582dde-cc89-4d46-8a8d-655a743e8c02): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:53:07 crc kubenswrapper[4690]: E0320 17:53:07.455316 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5f5b984557-pdc26" podUID="94582dde-cc89-4d46-8a8d-655a743e8c02" Mar 20 17:53:07 crc kubenswrapper[4690]: E0320 17:53:07.478572 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 17:53:07 crc kubenswrapper[4690]: E0320 17:53:07.478748 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bh9fh5bh548h88h584h676h96h9fh566h64ch9fhd8h665h5f9h569h564h5d4h588h57ch68h5dch94hcdh5c8h596h544h645h667h5d9h54fh5c7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6knm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-dd9c5c889-p2cbw_openstack(2a888cef-7b34-4080-9689-79877684be58): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:53:07 crc kubenswrapper[4690]: E0320 17:53:07.482446 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-dd9c5c889-p2cbw" podUID="2a888cef-7b34-4080-9689-79877684be58" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.550156 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.571563 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-httpd-run\") pod \"b5f727fa-133b-4692-b725-e6854dc359fd\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.571650 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-combined-ca-bundle\") pod \"b5f727fa-133b-4692-b725-e6854dc359fd\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.571716 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-config-data\") pod \"b5f727fa-133b-4692-b725-e6854dc359fd\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.571835 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-scripts\") pod \"b5f727fa-133b-4692-b725-e6854dc359fd\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.571871 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bls69\" (UniqueName: \"kubernetes.io/projected/b5f727fa-133b-4692-b725-e6854dc359fd-kube-api-access-bls69\") pod \"b5f727fa-133b-4692-b725-e6854dc359fd\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.571926 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-public-tls-certs\") pod \"b5f727fa-133b-4692-b725-e6854dc359fd\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.571983 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-logs\") pod \"b5f727fa-133b-4692-b725-e6854dc359fd\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.572236 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b5f727fa-133b-4692-b725-e6854dc359fd" (UID: "b5f727fa-133b-4692-b725-e6854dc359fd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.574823 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-logs" (OuterVolumeSpecName: "logs") pod "b5f727fa-133b-4692-b725-e6854dc359fd" (UID: "b5f727fa-133b-4692-b725-e6854dc359fd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.575310 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.575342 4690 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5f727fa-133b-4692-b725-e6854dc359fd-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.584745 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-scripts" (OuterVolumeSpecName: "scripts") pod "b5f727fa-133b-4692-b725-e6854dc359fd" (UID: "b5f727fa-133b-4692-b725-e6854dc359fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.589028 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f727fa-133b-4692-b725-e6854dc359fd-kube-api-access-bls69" (OuterVolumeSpecName: "kube-api-access-bls69") pod "b5f727fa-133b-4692-b725-e6854dc359fd" (UID: "b5f727fa-133b-4692-b725-e6854dc359fd"). InnerVolumeSpecName "kube-api-access-bls69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.634608 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5f727fa-133b-4692-b725-e6854dc359fd" (UID: "b5f727fa-133b-4692-b725-e6854dc359fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.634838 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-config-data" (OuterVolumeSpecName: "config-data") pod "b5f727fa-133b-4692-b725-e6854dc359fd" (UID: "b5f727fa-133b-4692-b725-e6854dc359fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.642033 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b5f727fa-133b-4692-b725-e6854dc359fd" (UID: "b5f727fa-133b-4692-b725-e6854dc359fd"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.677367 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b5f727fa-133b-4692-b725-e6854dc359fd\" (UID: \"b5f727fa-133b-4692-b725-e6854dc359fd\") " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.678218 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.678247 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.678274 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bls69\" (UniqueName: \"kubernetes.io/projected/b5f727fa-133b-4692-b725-e6854dc359fd-kube-api-access-bls69\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.678289 4690 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.678304 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f727fa-133b-4692-b725-e6854dc359fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.682076 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b5f727fa-133b-4692-b725-e6854dc359fd" (UID: "b5f727fa-133b-4692-b725-e6854dc359fd"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.779491 4690 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.802693 4690 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 20 17:53:07 crc kubenswrapper[4690]: I0320 17:53:07.881206 4690 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.290141 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.290399 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5f727fa-133b-4692-b725-e6854dc359fd","Type":"ContainerDied","Data":"784d2930b4839444625d546a6070040db4072e81f61faa429f66b9eb420e7913"} Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.290462 4690 scope.go:117] "RemoveContainer" containerID="05b2f269e92ffd8edc6ad6ac18dd0cd4e2903f7a1e3ce02e0534162a769823a8" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.390431 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.402385 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.419212 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:53:08 crc kubenswrapper[4690]: E0320 17:53:08.419714 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f727fa-133b-4692-b725-e6854dc359fd" containerName="glance-httpd" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.419730 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f727fa-133b-4692-b725-e6854dc359fd" containerName="glance-httpd" Mar 20 17:53:08 crc kubenswrapper[4690]: E0320 17:53:08.419752 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f727fa-133b-4692-b725-e6854dc359fd" containerName="glance-log" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.419758 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f727fa-133b-4692-b725-e6854dc359fd" containerName="glance-log" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.419941 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f727fa-133b-4692-b725-e6854dc359fd" containerName="glance-log" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.419958 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f727fa-133b-4692-b725-e6854dc359fd" containerName="glance-httpd" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.420928 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.423432 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.424135 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.439644 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.594739 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-logs\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.594778 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-config-data\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.594865 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.594897 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.594920 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.594947 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.595133 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4br\" (UniqueName: \"kubernetes.io/projected/d11f0ffd-e625-4b90-a1e5-2315bf45529d-kube-api-access-6n4br\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.595236 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-scripts\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.697312 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-logs\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.697689 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-config-data\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.697850 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-logs\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.699008 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.699129 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.699172 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.699240 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.699303 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4br\" (UniqueName: \"kubernetes.io/projected/d11f0ffd-e625-4b90-a1e5-2315bf45529d-kube-api-access-6n4br\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.699344 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.699371 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-scripts\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.699737 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.704003 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-scripts\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.704708 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.704838 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-config-data\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.711896 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.716415 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4br\" (UniqueName: \"kubernetes.io/projected/d11f0ffd-e625-4b90-a1e5-2315bf45529d-kube-api-access-6n4br\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.735309 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " pod="openstack/glance-default-external-api-0" Mar 20 17:53:08 crc kubenswrapper[4690]: I0320 17:53:08.749114 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:53:09 crc kubenswrapper[4690]: E0320 17:53:09.470479 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 17:53:09 crc kubenswrapper[4690]: E0320 17:53:09.470749 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh577h688h58bh585hd7h54bh677hcfh5b8h694h587h99h9dh5c6h84h68fh57fh5dfh647h99h59dh5b4h645h68bh668hcbh686hdch5d4h666hd4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6hkm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-d974c8585-p46g9_openstack(e58293ba-94a0-47f1-977a-2a10fd0f5845): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:53:09 crc kubenswrapper[4690]: E0320 17:53:09.472774 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-d974c8585-p46g9" podUID="e58293ba-94a0-47f1-977a-2a10fd0f5845" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.579772 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.584085 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.716829 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-scripts\") pod \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.716869 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.716901 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-combined-ca-bundle\") pod \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.716925 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-scripts\") pod \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.716957 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz7h2\" (UniqueName: \"kubernetes.io/projected/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-kube-api-access-gz7h2\") pod \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.716983 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-internal-tls-certs\") pod \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.716999 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-config-data\") pod \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.717070 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-fernet-keys\") pod \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.717114 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-combined-ca-bundle\") pod \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.717163 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-httpd-run\") pod \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " Mar 20 17:53:09 
crc kubenswrapper[4690]: I0320 17:53:09.717178 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-credential-keys\") pod \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.717248 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ldlp\" (UniqueName: \"kubernetes.io/projected/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-kube-api-access-6ldlp\") pod \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.717280 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-config-data\") pod \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\" (UID: \"b572482e-a7ea-4311-bff2-2cbd1ec4b42e\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.717299 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-logs\") pod \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\" (UID: \"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0\") " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.718108 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-logs" (OuterVolumeSpecName: "logs") pod "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" (UID: "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.719656 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" (UID: "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.721216 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-scripts" (OuterVolumeSpecName: "scripts") pod "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" (UID: "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.721389 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-scripts" (OuterVolumeSpecName: "scripts") pod "b572482e-a7ea-4311-bff2-2cbd1ec4b42e" (UID: "b572482e-a7ea-4311-bff2-2cbd1ec4b42e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.724405 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" (UID: "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.724487 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b572482e-a7ea-4311-bff2-2cbd1ec4b42e" (UID: "b572482e-a7ea-4311-bff2-2cbd1ec4b42e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.724709 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-kube-api-access-gz7h2" (OuterVolumeSpecName: "kube-api-access-gz7h2") pod "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" (UID: "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0"). InnerVolumeSpecName "kube-api-access-gz7h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.731758 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b572482e-a7ea-4311-bff2-2cbd1ec4b42e" (UID: "b572482e-a7ea-4311-bff2-2cbd1ec4b42e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.731920 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-kube-api-access-6ldlp" (OuterVolumeSpecName: "kube-api-access-6ldlp") pod "b572482e-a7ea-4311-bff2-2cbd1ec4b42e" (UID: "b572482e-a7ea-4311-bff2-2cbd1ec4b42e"). InnerVolumeSpecName "kube-api-access-6ldlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.750551 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b572482e-a7ea-4311-bff2-2cbd1ec4b42e" (UID: "b572482e-a7ea-4311-bff2-2cbd1ec4b42e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.759237 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" (UID: "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.766445 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-config-data" (OuterVolumeSpecName: "config-data") pod "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" (UID: "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.770974 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-config-data" (OuterVolumeSpecName: "config-data") pod "b572482e-a7ea-4311-bff2-2cbd1ec4b42e" (UID: "b572482e-a7ea-4311-bff2-2cbd1ec4b42e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.784925 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" (UID: "fe0f9e9c-fafa-47fd-9c26-58c6f77549b0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.819396 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.819610 4690 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.819708 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.819782 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.819840 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz7h2\" (UniqueName: \"kubernetes.io/projected/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-kube-api-access-gz7h2\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.819893 4690 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.819949 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.820006 4690 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.820058 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.820112 4690 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.820161 4690 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.820209 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ldlp\" 
(UniqueName: \"kubernetes.io/projected/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-kube-api-access-6ldlp\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.820273 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b572482e-a7ea-4311-bff2-2cbd1ec4b42e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.820331 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.838366 4690 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.895516 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f727fa-133b-4692-b725-e6854dc359fd" path="/var/lib/kubelet/pods/b5f727fa-133b-4692-b725-e6854dc359fd/volumes" Mar 20 17:53:09 crc kubenswrapper[4690]: I0320 17:53:09.922211 4690 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.307634 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.307650 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe0f9e9c-fafa-47fd-9c26-58c6f77549b0","Type":"ContainerDied","Data":"e4ca2ac6dfdf22b2164911289da3eeaa53890b1e09dd24c0c0c673f75654dfdf"} Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.311199 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f4szb" event={"ID":"b572482e-a7ea-4311-bff2-2cbd1ec4b42e","Type":"ContainerDied","Data":"aaca3b1392dec6d8d06c58001965720304aa77e766b602620c81fb7a7ac14aee"} Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.311288 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaca3b1392dec6d8d06c58001965720304aa77e766b602620c81fb7a7ac14aee" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.311221 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f4szb" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.379109 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.391546 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.403519 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:53:10 crc kubenswrapper[4690]: E0320 17:53:10.404019 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerName="glance-httpd" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.404046 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerName="glance-httpd" Mar 20 17:53:10 crc kubenswrapper[4690]: E0320 17:53:10.404086 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b572482e-a7ea-4311-bff2-2cbd1ec4b42e" containerName="keystone-bootstrap" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.404099 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="b572482e-a7ea-4311-bff2-2cbd1ec4b42e" containerName="keystone-bootstrap" Mar 20 17:53:10 crc kubenswrapper[4690]: E0320 17:53:10.404122 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerName="glance-log" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.404134 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerName="glance-log" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.404389 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="b572482e-a7ea-4311-bff2-2cbd1ec4b42e" containerName="keystone-bootstrap" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.404422 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerName="glance-httpd" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.404448 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" containerName="glance-log" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.405746 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.408224 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.408486 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.416507 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.540730 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.541017 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt7k2\" (UniqueName: \"kubernetes.io/projected/deb0f27d-5620-4c5e-b5b0-a068c76c566f-kube-api-access-vt7k2\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.541062 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.541105 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.541133 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.541170 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.541203 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-logs\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.541321 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.642851 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.642914 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-logs\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.642950 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.643031 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.643048 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt7k2\" (UniqueName: \"kubernetes.io/projected/deb0f27d-5620-4c5e-b5b0-a068c76c566f-kube-api-access-vt7k2\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.643072 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.643096 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.643161 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.645086 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.645166 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.645320 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-logs\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.650627 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.654710 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.657100 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.665776 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.666732 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt7k2\" (UniqueName: \"kubernetes.io/projected/deb0f27d-5620-4c5e-b5b0-a068c76c566f-kube-api-access-vt7k2\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.677455 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.730809 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.791910 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-f4szb"] Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.806866 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-f4szb"] Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.830501 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m2vln"] Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.832358 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.835172 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.835211 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.835393 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.835464 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7l4fq" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.836077 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.845169 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m2vln"] Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.952184 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-config-data\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.953790 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-scripts\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.953885 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-fernet-keys\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.954006 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-credential-keys\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.954050 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h57tl\" (UniqueName: \"kubernetes.io/projected/2055a33a-e663-4664-9240-aab3d338c45e-kube-api-access-h57tl\") 
pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:10 crc kubenswrapper[4690]: I0320 17:53:10.954184 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-combined-ca-bundle\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.055950 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-fernet-keys\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.055999 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-scripts\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.056064 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-credential-keys\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.056105 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h57tl\" (UniqueName: \"kubernetes.io/projected/2055a33a-e663-4664-9240-aab3d338c45e-kube-api-access-h57tl\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.056141 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-combined-ca-bundle\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.056232 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-config-data\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.061084 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-scripts\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.062957 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-combined-ca-bundle\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc 
kubenswrapper[4690]: I0320 17:53:11.062996 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-config-data\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.065026 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-fernet-keys\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.066775 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-credential-keys\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.071607 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h57tl\" (UniqueName: \"kubernetes.io/projected/2055a33a-e663-4664-9240-aab3d338c45e-kube-api-access-h57tl\") pod \"keystone-bootstrap-m2vln\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.156514 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.893686 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b572482e-a7ea-4311-bff2-2cbd1ec4b42e" path="/var/lib/kubelet/pods/b572482e-a7ea-4311-bff2-2cbd1ec4b42e/volumes" Mar 20 17:53:11 crc kubenswrapper[4690]: I0320 17:53:11.894807 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0f9e9c-fafa-47fd-9c26-58c6f77549b0" path="/var/lib/kubelet/pods/fe0f9e9c-fafa-47fd-9c26-58c6f77549b0/volumes" Mar 20 17:53:12 crc kubenswrapper[4690]: I0320 17:53:12.331285 4690 generic.go:334] "Generic (PLEG): container finished" podID="3770976f-1610-4bb2-97db-0d81d8af8de1" containerID="cba91b6ab23732ed261f40e322910c9f1b17b102b8693b9ec31cdbe5057efa66" exitCode=0 Mar 20 17:53:12 crc kubenswrapper[4690]: I0320 17:53:12.331393 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t2qth" event={"ID":"3770976f-1610-4bb2-97db-0d81d8af8de1","Type":"ContainerDied","Data":"cba91b6ab23732ed261f40e322910c9f1b17b102b8693b9ec31cdbe5057efa66"} Mar 20 17:53:12 crc kubenswrapper[4690]: I0320 17:53:12.386736 4690 scope.go:117] "RemoveContainer" containerID="8ba775c8634732f742dc32ece3d7951ce7af5e382cd21ed071defbae46a49a7a" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.568088 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.574716 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665433 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-config-data\") pod \"94582dde-cc89-4d46-8a8d-655a743e8c02\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665497 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-config-data\") pod \"2a888cef-7b34-4080-9689-79877684be58\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665566 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94582dde-cc89-4d46-8a8d-655a743e8c02-logs\") pod \"94582dde-cc89-4d46-8a8d-655a743e8c02\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665663 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-scripts\") pod \"94582dde-cc89-4d46-8a8d-655a743e8c02\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665714 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-scripts\") pod \"2a888cef-7b34-4080-9689-79877684be58\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665759 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/94582dde-cc89-4d46-8a8d-655a743e8c02-horizon-secret-key\") pod \"94582dde-cc89-4d46-8a8d-655a743e8c02\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665779 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a888cef-7b34-4080-9689-79877684be58-horizon-secret-key\") pod \"2a888cef-7b34-4080-9689-79877684be58\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665807 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a888cef-7b34-4080-9689-79877684be58-logs\") pod \"2a888cef-7b34-4080-9689-79877684be58\" (UID: \"2a888cef-7b34-4080-9689-79877684be58\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665843 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwhnw\" (UniqueName: \"kubernetes.io/projected/94582dde-cc89-4d46-8a8d-655a743e8c02-kube-api-access-kwhnw\") pod \"94582dde-cc89-4d46-8a8d-655a743e8c02\" (UID: \"94582dde-cc89-4d46-8a8d-655a743e8c02\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.665860 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6knm\" (UniqueName: \"kubernetes.io/projected/2a888cef-7b34-4080-9689-79877684be58-kube-api-access-d6knm\") pod \"2a888cef-7b34-4080-9689-79877684be58\" (UID: 
\"2a888cef-7b34-4080-9689-79877684be58\") " Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.666754 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a888cef-7b34-4080-9689-79877684be58-logs" (OuterVolumeSpecName: "logs") pod "2a888cef-7b34-4080-9689-79877684be58" (UID: "2a888cef-7b34-4080-9689-79877684be58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.667002 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94582dde-cc89-4d46-8a8d-655a743e8c02-logs" (OuterVolumeSpecName: "logs") pod "94582dde-cc89-4d46-8a8d-655a743e8c02" (UID: "94582dde-cc89-4d46-8a8d-655a743e8c02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.667149 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-scripts" (OuterVolumeSpecName: "scripts") pod "94582dde-cc89-4d46-8a8d-655a743e8c02" (UID: "94582dde-cc89-4d46-8a8d-655a743e8c02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.667196 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-scripts" (OuterVolumeSpecName: "scripts") pod "2a888cef-7b34-4080-9689-79877684be58" (UID: "2a888cef-7b34-4080-9689-79877684be58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.667380 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-config-data" (OuterVolumeSpecName: "config-data") pod "94582dde-cc89-4d46-8a8d-655a743e8c02" (UID: "94582dde-cc89-4d46-8a8d-655a743e8c02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.667389 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-config-data" (OuterVolumeSpecName: "config-data") pod "2a888cef-7b34-4080-9689-79877684be58" (UID: "2a888cef-7b34-4080-9689-79877684be58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.673997 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94582dde-cc89-4d46-8a8d-655a743e8c02-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "94582dde-cc89-4d46-8a8d-655a743e8c02" (UID: "94582dde-cc89-4d46-8a8d-655a743e8c02"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.674060 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a888cef-7b34-4080-9689-79877684be58-kube-api-access-d6knm" (OuterVolumeSpecName: "kube-api-access-d6knm") pod "2a888cef-7b34-4080-9689-79877684be58" (UID: "2a888cef-7b34-4080-9689-79877684be58"). InnerVolumeSpecName "kube-api-access-d6knm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.674998 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94582dde-cc89-4d46-8a8d-655a743e8c02-kube-api-access-kwhnw" (OuterVolumeSpecName: "kube-api-access-kwhnw") pod "94582dde-cc89-4d46-8a8d-655a743e8c02" (UID: "94582dde-cc89-4d46-8a8d-655a743e8c02"). InnerVolumeSpecName "kube-api-access-kwhnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.675599 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a888cef-7b34-4080-9689-79877684be58-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2a888cef-7b34-4080-9689-79877684be58" (UID: "2a888cef-7b34-4080-9689-79877684be58"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.770998 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771037 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771051 4690 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/94582dde-cc89-4d46-8a8d-655a743e8c02-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771067 4690 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a888cef-7b34-4080-9689-79877684be58-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771084 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a888cef-7b34-4080-9689-79877684be58-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771096 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwhnw\" (UniqueName: \"kubernetes.io/projected/94582dde-cc89-4d46-8a8d-655a743e8c02-kube-api-access-kwhnw\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771109 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6knm\" (UniqueName: \"kubernetes.io/projected/2a888cef-7b34-4080-9689-79877684be58-kube-api-access-d6knm\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771121 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94582dde-cc89-4d46-8a8d-655a743e8c02-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771138 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a888cef-7b34-4080-9689-79877684be58-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:16 crc kubenswrapper[4690]: I0320 17:53:16.771151 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/94582dde-cc89-4d46-8a8d-655a743e8c02-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: E0320 17:53:17.082063 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 17:53:17 crc kubenswrapper[4690]: E0320 17:53:17.082314 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ck7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-m4wn2_openstack(d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:53:17 crc kubenswrapper[4690]: E0320 17:53:17.083888 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-m4wn2" podUID="d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.118388 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.118722 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.139245 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.156125 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.160335 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t2qth" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288692 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-sb\") pod \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288776 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e58293ba-94a0-47f1-977a-2a10fd0f5845-horizon-secret-key\") pod \"e58293ba-94a0-47f1-977a-2a10fd0f5845\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288801 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-nb\") pod \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288821 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-config\") pod \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288848 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hkm2\" (UniqueName: \"kubernetes.io/projected/e58293ba-94a0-47f1-977a-2a10fd0f5845-kube-api-access-6hkm2\") pod \"e58293ba-94a0-47f1-977a-2a10fd0f5845\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288872 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-config-data\") pod \"e58293ba-94a0-47f1-977a-2a10fd0f5845\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288940 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-config\") pod \"3770976f-1610-4bb2-97db-0d81d8af8de1\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288967 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2j52\" (UniqueName: \"kubernetes.io/projected/3770976f-1610-4bb2-97db-0d81d8af8de1-kube-api-access-s2j52\") pod \"3770976f-1610-4bb2-97db-0d81d8af8de1\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.288983 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-scripts\") pod \"e58293ba-94a0-47f1-977a-2a10fd0f5845\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.289017 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-742ks\" (UniqueName: \"kubernetes.io/projected/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-kube-api-access-742ks\") pod \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.289041 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-combined-ca-bundle\") pod \"3770976f-1610-4bb2-97db-0d81d8af8de1\" (UID: \"3770976f-1610-4bb2-97db-0d81d8af8de1\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.289082 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e58293ba-94a0-47f1-977a-2a10fd0f5845-logs\") pod \"e58293ba-94a0-47f1-977a-2a10fd0f5845\" (UID: \"e58293ba-94a0-47f1-977a-2a10fd0f5845\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.289099 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-swift-storage-0\") pod \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.289121 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-svc\") pod \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\" (UID: \"d5e029b9-bf4d-4700-9a5f-c35bd3459b15\") " Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.293775 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58293ba-94a0-47f1-977a-2a10fd0f5845-logs" (OuterVolumeSpecName: "logs") pod "e58293ba-94a0-47f1-977a-2a10fd0f5845" (UID: "e58293ba-94a0-47f1-977a-2a10fd0f5845"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.303818 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-config-data" (OuterVolumeSpecName: "config-data") pod "e58293ba-94a0-47f1-977a-2a10fd0f5845" (UID: "e58293ba-94a0-47f1-977a-2a10fd0f5845"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.308480 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-scripts" (OuterVolumeSpecName: "scripts") pod "e58293ba-94a0-47f1-977a-2a10fd0f5845" (UID: "e58293ba-94a0-47f1-977a-2a10fd0f5845"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.311107 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3770976f-1610-4bb2-97db-0d81d8af8de1-kube-api-access-s2j52" (OuterVolumeSpecName: "kube-api-access-s2j52") pod "3770976f-1610-4bb2-97db-0d81d8af8de1" (UID: "3770976f-1610-4bb2-97db-0d81d8af8de1"). InnerVolumeSpecName "kube-api-access-s2j52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.331184 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e58293ba-94a0-47f1-977a-2a10fd0f5845-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e58293ba-94a0-47f1-977a-2a10fd0f5845" (UID: "e58293ba-94a0-47f1-977a-2a10fd0f5845"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.336922 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-kube-api-access-742ks" (OuterVolumeSpecName: "kube-api-access-742ks") pod "d5e029b9-bf4d-4700-9a5f-c35bd3459b15" (UID: "d5e029b9-bf4d-4700-9a5f-c35bd3459b15"). InnerVolumeSpecName "kube-api-access-742ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.339320 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58293ba-94a0-47f1-977a-2a10fd0f5845-kube-api-access-6hkm2" (OuterVolumeSpecName: "kube-api-access-6hkm2") pod "e58293ba-94a0-47f1-977a-2a10fd0f5845" (UID: "e58293ba-94a0-47f1-977a-2a10fd0f5845"). InnerVolumeSpecName "kube-api-access-6hkm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.376769 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3770976f-1610-4bb2-97db-0d81d8af8de1" (UID: "3770976f-1610-4bb2-97db-0d81d8af8de1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.382020 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-config" (OuterVolumeSpecName: "config") pod "3770976f-1610-4bb2-97db-0d81d8af8de1" (UID: "3770976f-1610-4bb2-97db-0d81d8af8de1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.382972 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t2qth" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.383113 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t2qth" event={"ID":"3770976f-1610-4bb2-97db-0d81d8af8de1","Type":"ContainerDied","Data":"06fcc9c3c97ee1cfffdb9190a5a2c534c94e9d009361140cf6b83d9da2eb4294"} Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.383150 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06fcc9c3c97ee1cfffdb9190a5a2c534c94e9d009361140cf6b83d9da2eb4294" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.385914 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f5b984557-pdc26" event={"ID":"94582dde-cc89-4d46-8a8d-655a743e8c02","Type":"ContainerDied","Data":"9eb81841b72316a1512f4e8eaeb5b51b5e5e5a32aea8a5a6f3968aefa0982f0a"} Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.386005 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f5b984557-pdc26" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390466 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390488 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2j52\" (UniqueName: \"kubernetes.io/projected/3770976f-1610-4bb2-97db-0d81d8af8de1-kube-api-access-s2j52\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390496 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390505 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-742ks\" (UniqueName: \"kubernetes.io/projected/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-kube-api-access-742ks\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390514 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3770976f-1610-4bb2-97db-0d81d8af8de1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390523 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e58293ba-94a0-47f1-977a-2a10fd0f5845-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390531 4690 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e58293ba-94a0-47f1-977a-2a10fd0f5845-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390539 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hkm2\" (UniqueName: \"kubernetes.io/projected/e58293ba-94a0-47f1-977a-2a10fd0f5845-kube-api-access-6hkm2\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.390547 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e58293ba-94a0-47f1-977a-2a10fd0f5845-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.391079 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dd9c5c889-p2cbw" event={"ID":"2a888cef-7b34-4080-9689-79877684be58","Type":"ContainerDied","Data":"94e1a80a70166a34a353cd235f9fca27370e866e7d524cc08636775cd6dc2dec"} Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.391166 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dd9c5c889-p2cbw" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.392316 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5e029b9-bf4d-4700-9a5f-c35bd3459b15" (UID: "d5e029b9-bf4d-4700-9a5f-c35bd3459b15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.402018 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d974c8585-p46g9" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.402030 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d974c8585-p46g9" event={"ID":"e58293ba-94a0-47f1-977a-2a10fd0f5845","Type":"ContainerDied","Data":"981c62c8508c07c6d0631b84e33152a25df3591d9369b345a2375943883d4d7b"} Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.406033 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.406214 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" event={"ID":"d5e029b9-bf4d-4700-9a5f-c35bd3459b15","Type":"ContainerDied","Data":"3511c0655e7d34dc813afbc16bf7ef428907f3729fb70fc4c23bc7e54e518e06"} Mar 20 17:53:17 crc kubenswrapper[4690]: E0320 17:53:17.406902 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-m4wn2" podUID="d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.409903 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d5e029b9-bf4d-4700-9a5f-c35bd3459b15" (UID: "d5e029b9-bf4d-4700-9a5f-c35bd3459b15"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.409933 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5e029b9-bf4d-4700-9a5f-c35bd3459b15" (UID: "d5e029b9-bf4d-4700-9a5f-c35bd3459b15"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.410892 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5e029b9-bf4d-4700-9a5f-c35bd3459b15" (UID: "d5e029b9-bf4d-4700-9a5f-c35bd3459b15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.432413 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-config" (OuterVolumeSpecName: "config") pod "d5e029b9-bf4d-4700-9a5f-c35bd3459b15" (UID: "d5e029b9-bf4d-4700-9a5f-c35bd3459b15"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.492527 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.492558 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.492567 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.492575 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.492584 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e029b9-bf4d-4700-9a5f-c35bd3459b15-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.511230 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dd9c5c889-p2cbw"] Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.524831 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-dd9c5c889-p2cbw"] Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.556461 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d974c8585-p46g9"] Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.565816 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-d974c8585-p46g9"] Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.580230 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f5b984557-pdc26"] Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.586460 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f5b984557-pdc26"] Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.755171 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-f6h5n"] Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.773631 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-f6h5n"] Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.894707 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a888cef-7b34-4080-9689-79877684be58" path="/var/lib/kubelet/pods/2a888cef-7b34-4080-9689-79877684be58/volumes" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.895233 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94582dde-cc89-4d46-8a8d-655a743e8c02" path="/var/lib/kubelet/pods/94582dde-cc89-4d46-8a8d-655a743e8c02/volumes" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.895662 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" path="/var/lib/kubelet/pods/d5e029b9-bf4d-4700-9a5f-c35bd3459b15/volumes" Mar 20 17:53:17 crc kubenswrapper[4690]: I0320 17:53:17.896486 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e58293ba-94a0-47f1-977a-2a10fd0f5845" path="/var/lib/kubelet/pods/e58293ba-94a0-47f1-977a-2a10fd0f5845/volumes" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.354305 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h8fdq"] Mar 20 17:53:18 crc kubenswrapper[4690]: E0320 17:53:18.354649 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="init" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.354686 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="init" Mar 20 17:53:18 crc kubenswrapper[4690]: E0320 17:53:18.354700 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="dnsmasq-dns" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.354707 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="dnsmasq-dns" Mar 20 17:53:18 crc kubenswrapper[4690]: E0320 17:53:18.354717 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3770976f-1610-4bb2-97db-0d81d8af8de1" containerName="neutron-db-sync" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.354725 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="3770976f-1610-4bb2-97db-0d81d8af8de1" containerName="neutron-db-sync" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.354914 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="dnsmasq-dns" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.354933 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="3770976f-1610-4bb2-97db-0d81d8af8de1" containerName="neutron-db-sync" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.355742 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.372687 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h8fdq"] Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.533042 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.533196 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-config\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.533227 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.533336 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t2qk\" (UniqueName: \"kubernetes.io/projected/a934f11e-9b01-4a42-ba23-75fbf6461c04-kube-api-access-2t2qk\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.533363 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.533468 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: E0320 17:53:18.539108 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 17:53:18 crc kubenswrapper[4690]: E0320 17:53:18.539407 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gqwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wqk6t_openstack(ef3bcd50-5724-42a1-92df-262256c07d45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.539657 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75b55fdddd-6ht5q"] Mar 20 17:53:18 crc kubenswrapper[4690]: E0320 17:53:18.541064 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wqk6t" podUID="ef3bcd50-5724-42a1-92df-262256c07d45" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.542807 4690 scope.go:117] "RemoveContainer" containerID="aad04c4f37bcade50528ca6e45886e535e75346d3f121c5a109505ce3274d6f1" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.545114 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.547625 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.548062 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.564864 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xwbns" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.565113 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.568528 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75b55fdddd-6ht5q"] Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.625358 4690 scope.go:117] "RemoveContainer" containerID="79dab24f4bf0296932cee4a6d89d82e399b89305d1e558f9a96ed466729c8f4c" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.634855 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-config\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.634918 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-combined-ca-bundle\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.635025 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636014 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbkbz\" (UniqueName: \"kubernetes.io/projected/04ab06c3-11ab-4253-bafa-fea6ac93bedf-kube-api-access-kbkbz\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636125 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636237 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-config\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636274 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636308 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-httpd-config\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636340 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-ovndb-tls-certs\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636387 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t2qk\" (UniqueName: \"kubernetes.io/projected/a934f11e-9b01-4a42-ba23-75fbf6461c04-kube-api-access-2t2qk\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636410 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636418 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.636972 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.637863 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.638032 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.638128 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-config\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.662351 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t2qk\" (UniqueName: \"kubernetes.io/projected/a934f11e-9b01-4a42-ba23-75fbf6461c04-kube-api-access-2t2qk\") pod \"dnsmasq-dns-55f844cf75-h8fdq\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.682591 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.737606 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-combined-ca-bundle\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.737658 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbkbz\" (UniqueName: \"kubernetes.io/projected/04ab06c3-11ab-4253-bafa-fea6ac93bedf-kube-api-access-kbkbz\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.737739 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-httpd-config\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.737765 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-ovndb-tls-certs\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.737800 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-config\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.741916 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-httpd-config\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.742721 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-config\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.745280 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-ovndb-tls-certs\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.753414 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-combined-ca-bundle\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.764006 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbkbz\" (UniqueName: \"kubernetes.io/projected/04ab06c3-11ab-4253-bafa-fea6ac93bedf-kube-api-access-kbkbz\") pod \"neutron-75b55fdddd-6ht5q\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.782333 4690 scope.go:117] "RemoveContainer" containerID="fc1f322af9e991ae5b698c4deba85094dbcc4f9719aba26a22ba3279040910bb" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.902968 4690 scope.go:117] "RemoveContainer" containerID="75ee5ffb621d9785b335466768a7b7cc4b8bfa373b10846427a02d4c1abc31cd" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.903019 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:18 crc kubenswrapper[4690]: I0320 17:53:18.962447 4690 scope.go:117] "RemoveContainer" containerID="423da00fb24e98f7484f72a09b566ac68729f7ec22d23ce862306c7ff6608587" Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.095771 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-867c5896-qkwmr"] Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.202901 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dc95ccffb-gvrdq"] Mar 20 17:53:19 crc kubenswrapper[4690]: W0320 17:53:19.215524 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod799b195a_e6e5_4a19_b41a_1c7550e21e90.slice/crio-476b105aadb7963a0e54c2721afbe610e3e06abb802d1277979df20f6bbc8fb7 WatchSource:0}: Error finding container 476b105aadb7963a0e54c2721afbe610e3e06abb802d1277979df20f6bbc8fb7: Status 404 returned error can't find the container with id 476b105aadb7963a0e54c2721afbe610e3e06abb802d1277979df20f6bbc8fb7 Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.262545 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m2vln"] Mar 20 17:53:19 crc kubenswrapper[4690]: W0320 17:53:19.378282 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd11f0ffd_e625_4b90_a1e5_2315bf45529d.slice/crio-ba14332577c0b75ae272556eb11fd1aaaace63c827978692cf05a08e58396ed5 WatchSource:0}: Error finding container ba14332577c0b75ae272556eb11fd1aaaace63c827978692cf05a08e58396ed5: Status 404 returned error can't find the container with id ba14332577c0b75ae272556eb11fd1aaaace63c827978692cf05a08e58396ed5 Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.385753 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.429150 4690 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/horizon-dc95ccffb-gvrdq" event={"ID":"799b195a-e6e5-4a19-b41a-1c7550e21e90","Type":"ContainerStarted","Data":"476b105aadb7963a0e54c2721afbe610e3e06abb802d1277979df20f6bbc8fb7"} Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.434370 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzgr7" event={"ID":"14a50078-3ca8-4c47-8067-7473a9376323","Type":"ContainerStarted","Data":"a7b92f445b086ac03ad7068b57f19c3f42f306c08763ea5e7ee0bf1bb4b060a9"} Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.443555 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d11f0ffd-e625-4b90-a1e5-2315bf45529d","Type":"ContainerStarted","Data":"ba14332577c0b75ae272556eb11fd1aaaace63c827978692cf05a08e58396ed5"} Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.445388 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerStarted","Data":"b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd"} Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.457036 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dzgr7" podStartSLOduration=4.351573456 podStartE2EDuration="29.457020665s" podCreationTimestamp="2026-03-20 17:52:50 +0000 UTC" firstStartedPulling="2026-03-20 17:52:51.978563659 +0000 UTC m=+1246.844389327" lastFinishedPulling="2026-03-20 17:53:17.084010818 +0000 UTC m=+1271.949836536" observedRunningTime="2026-03-20 17:53:19.454148673 +0000 UTC m=+1274.319974351" watchObservedRunningTime="2026-03-20 17:53:19.457020665 +0000 UTC m=+1274.322846343" Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.467737 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867c5896-qkwmr" event={"ID":"607d61e7-e52a-46e6-a23a-2d4714c5b543","Type":"ContainerStarted","Data":"b621d720f873c1bef307556d0e25334ff094c8a37d689fb519e73f28d11cced4"} Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.474352 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m2vln" event={"ID":"2055a33a-e663-4664-9240-aab3d338c45e","Type":"ContainerStarted","Data":"497b63fb8f9a7e7b166b696d40aad1501583fcdb5de96bd16904abbdc8df6ebf"} Mar 20 17:53:19 crc kubenswrapper[4690]: E0320 17:53:19.490809 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-wqk6t" podUID="ef3bcd50-5724-42a1-92df-262256c07d45" Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.493822 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.515705 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h8fdq"] Mar 20 17:53:19 crc kubenswrapper[4690]: I0320 17:53:19.582466 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75b55fdddd-6ht5q"] Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.486764 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b55fdddd-6ht5q" event={"ID":"04ab06c3-11ab-4253-bafa-fea6ac93bedf","Type":"ContainerStarted","Data":"d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703"} 
Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.487406 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b55fdddd-6ht5q" event={"ID":"04ab06c3-11ab-4253-bafa-fea6ac93bedf","Type":"ContainerStarted","Data":"9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.487426 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b55fdddd-6ht5q" event={"ID":"04ab06c3-11ab-4253-bafa-fea6ac93bedf","Type":"ContainerStarted","Data":"df618994c4b8b0b639b4d393714ca9a0a45e07fc92084bb9a36b4a4dedd90888"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.487562 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.492864 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867c5896-qkwmr" event={"ID":"607d61e7-e52a-46e6-a23a-2d4714c5b543","Type":"ContainerStarted","Data":"986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.492916 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867c5896-qkwmr" event={"ID":"607d61e7-e52a-46e6-a23a-2d4714c5b543","Type":"ContainerStarted","Data":"83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.513872 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75b55fdddd-6ht5q" podStartSLOduration=2.513849932 podStartE2EDuration="2.513849932s" podCreationTimestamp="2026-03-20 17:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:20.509714014 +0000 UTC m=+1275.375539702" watchObservedRunningTime="2026-03-20 17:53:20.513849932 +0000 UTC m=+1275.379675610" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.515142 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"deb0f27d-5620-4c5e-b5b0-a068c76c566f","Type":"ContainerStarted","Data":"015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.515189 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"deb0f27d-5620-4c5e-b5b0-a068c76c566f","Type":"ContainerStarted","Data":"8e4e3c37f96e939e058abff7821c99808dc700877ce98560f64c091b0836e0dd"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.527827 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d11f0ffd-e625-4b90-a1e5-2315bf45529d","Type":"ContainerStarted","Data":"812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.535948 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-867c5896-qkwmr" podStartSLOduration=20.985511575 podStartE2EDuration="21.535933414s" podCreationTimestamp="2026-03-20 17:52:59 +0000 UTC" firstStartedPulling="2026-03-20 17:53:19.112654436 +0000 UTC m=+1273.978480114" lastFinishedPulling="2026-03-20 17:53:19.663076275 +0000 UTC m=+1274.528901953" observedRunningTime="2026-03-20 17:53:20.529714306 +0000 UTC m=+1275.395539974" watchObservedRunningTime="2026-03-20 17:53:20.535933414 +0000 UTC m=+1275.401759092" Mar 
20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.540152 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m2vln" event={"ID":"2055a33a-e663-4664-9240-aab3d338c45e","Type":"ContainerStarted","Data":"2d7067f9eb49bcecf0d8a345aba52d85b7adeb2469cdac922ca43bee76fec1df"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.543594 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dc95ccffb-gvrdq" event={"ID":"799b195a-e6e5-4a19-b41a-1c7550e21e90","Type":"ContainerStarted","Data":"8889c09cc3a72c2f4fab805ad783f74a3d37be121ac61daa7e15486390b44d83"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.543632 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dc95ccffb-gvrdq" event={"ID":"799b195a-e6e5-4a19-b41a-1c7550e21e90","Type":"ContainerStarted","Data":"0e7b0db7df78f174ed875e812b8a2d57f9384cb47b49e2463ea28bed52e74594"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.548547 4690 generic.go:334] "Generic (PLEG): container finished" podID="a934f11e-9b01-4a42-ba23-75fbf6461c04" containerID="abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a" exitCode=0 Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.548669 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" event={"ID":"a934f11e-9b01-4a42-ba23-75fbf6461c04","Type":"ContainerDied","Data":"abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.548718 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" event={"ID":"a934f11e-9b01-4a42-ba23-75fbf6461c04","Type":"ContainerStarted","Data":"6f56d9c4d921023b50fe4d774be1a3efa30728fed3dd0f3a5345f4d3c3992ed7"} Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.557842 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m2vln" podStartSLOduration=10.557826501 podStartE2EDuration="10.557826501s" podCreationTimestamp="2026-03-20 17:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:20.554126595 +0000 UTC m=+1275.419952283" watchObservedRunningTime="2026-03-20 17:53:20.557826501 +0000 UTC m=+1275.423652179" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.617586 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-dc95ccffb-gvrdq" podStartSLOduration=21.099938392 podStartE2EDuration="21.617565682s" podCreationTimestamp="2026-03-20 17:52:59 +0000 UTC" firstStartedPulling="2026-03-20 17:53:19.22317178 +0000 UTC m=+1274.088997458" lastFinishedPulling="2026-03-20 17:53:19.74079907 +0000 UTC m=+1274.606624748" observedRunningTime="2026-03-20 17:53:20.604045115 +0000 UTC m=+1275.469870793" watchObservedRunningTime="2026-03-20 17:53:20.617565682 +0000 UTC m=+1275.483391360" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.698826 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84b8586c59-k9gqt"] Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.700454 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.702632 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.704248 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.707727 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84b8586c59-k9gqt"] Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.882588 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-public-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.883125 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-internal-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.883238 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-combined-ca-bundle\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.883276 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-config\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.883350 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-ovndb-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.883396 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcwtp\" (UniqueName: \"kubernetes.io/projected/e1874a81-9fc0-4cb0-a681-8ab78df069a0-kube-api-access-xcwtp\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:20 crc kubenswrapper[4690]: I0320 17:53:20.883469 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-httpd-config\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.001442 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-httpd-config\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.001599 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-public-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.001778 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-internal-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.001866 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-combined-ca-bundle\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.001937 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-config\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.001983 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-ovndb-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.002124 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcwtp\" (UniqueName: \"kubernetes.io/projected/e1874a81-9fc0-4cb0-a681-8ab78df069a0-kube-api-access-xcwtp\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.010287 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-public-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.012804 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-ovndb-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.014528 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-config\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " 
pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.015552 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-internal-tls-certs\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.025443 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-httpd-config\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.027317 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-combined-ca-bundle\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.033423 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcwtp\" (UniqueName: \"kubernetes.io/projected/e1874a81-9fc0-4cb0-a681-8ab78df069a0-kube-api-access-xcwtp\") pod \"neutron-84b8586c59-k9gqt\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.044799 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.558808 4690 generic.go:334] "Generic (PLEG): container finished" podID="14a50078-3ca8-4c47-8067-7473a9376323" containerID="a7b92f445b086ac03ad7068b57f19c3f42f306c08763ea5e7ee0bf1bb4b060a9" exitCode=0 Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.559126 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzgr7" event={"ID":"14a50078-3ca8-4c47-8067-7473a9376323","Type":"ContainerDied","Data":"a7b92f445b086ac03ad7068b57f19c3f42f306c08763ea5e7ee0bf1bb4b060a9"} Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.563462 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d11f0ffd-e625-4b90-a1e5-2315bf45529d","Type":"ContainerStarted","Data":"f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3"} Mar 20 17:53:21 crc kubenswrapper[4690]: I0320 17:53:21.600949 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.600931355 podStartE2EDuration="13.600931355s" podCreationTimestamp="2026-03-20 17:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:21.599443383 +0000 UTC m=+1276.465269071" watchObservedRunningTime="2026-03-20 17:53:21.600931355 +0000 UTC m=+1276.466757033" Mar 20 17:53:22 crc kubenswrapper[4690]: I0320 17:53:22.118667 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-f6h5n" podUID="d5e029b9-bf4d-4700-9a5f-c35bd3459b15" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Mar 20 17:53:22 crc kubenswrapper[4690]: I0320 17:53:22.919225 
4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84b8586c59-k9gqt"] Mar 20 17:53:22 crc kubenswrapper[4690]: W0320 17:53:22.928369 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1874a81_9fc0_4cb0_a681_8ab78df069a0.slice/crio-f9f880736267c6488b3b56ef34460cc43c3cee9d34f6fa362e061a623b852b36 WatchSource:0}: Error finding container f9f880736267c6488b3b56ef34460cc43c3cee9d34f6fa362e061a623b852b36: Status 404 returned error can't find the container with id f9f880736267c6488b3b56ef34460cc43c3cee9d34f6fa362e061a623b852b36 Mar 20 17:53:22 crc kubenswrapper[4690]: I0320 17:53:22.958739 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dzgr7" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.146782 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-combined-ca-bundle\") pod \"14a50078-3ca8-4c47-8067-7473a9376323\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.146922 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-scripts\") pod \"14a50078-3ca8-4c47-8067-7473a9376323\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.146995 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a50078-3ca8-4c47-8067-7473a9376323-logs\") pod \"14a50078-3ca8-4c47-8067-7473a9376323\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.147060 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hdw2\" (UniqueName: \"kubernetes.io/projected/14a50078-3ca8-4c47-8067-7473a9376323-kube-api-access-8hdw2\") pod \"14a50078-3ca8-4c47-8067-7473a9376323\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.147091 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-config-data\") pod \"14a50078-3ca8-4c47-8067-7473a9376323\" (UID: \"14a50078-3ca8-4c47-8067-7473a9376323\") " Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.149373 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a50078-3ca8-4c47-8067-7473a9376323-logs" (OuterVolumeSpecName: "logs") pod "14a50078-3ca8-4c47-8067-7473a9376323" (UID: "14a50078-3ca8-4c47-8067-7473a9376323"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.155218 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-scripts" (OuterVolumeSpecName: "scripts") pod "14a50078-3ca8-4c47-8067-7473a9376323" (UID: "14a50078-3ca8-4c47-8067-7473a9376323"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.155715 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a50078-3ca8-4c47-8067-7473a9376323-kube-api-access-8hdw2" (OuterVolumeSpecName: "kube-api-access-8hdw2") pod "14a50078-3ca8-4c47-8067-7473a9376323" (UID: "14a50078-3ca8-4c47-8067-7473a9376323"). InnerVolumeSpecName "kube-api-access-8hdw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.187914 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-config-data" (OuterVolumeSpecName: "config-data") pod "14a50078-3ca8-4c47-8067-7473a9376323" (UID: "14a50078-3ca8-4c47-8067-7473a9376323"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.191735 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14a50078-3ca8-4c47-8067-7473a9376323" (UID: "14a50078-3ca8-4c47-8067-7473a9376323"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.251782 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hdw2\" (UniqueName: \"kubernetes.io/projected/14a50078-3ca8-4c47-8067-7473a9376323-kube-api-access-8hdw2\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.251824 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.251837 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.251850 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a50078-3ca8-4c47-8067-7473a9376323-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.251859 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a50078-3ca8-4c47-8067-7473a9376323-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.650778 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzgr7" event={"ID":"14a50078-3ca8-4c47-8067-7473a9376323","Type":"ContainerDied","Data":"4240b34311429dae5c14a389c14abcb9f8ebb9c87a73ca9455634d57ad2b7f52"} Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.651178 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4240b34311429dae5c14a389c14abcb9f8ebb9c87a73ca9455634d57ad2b7f52" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.651351 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzgr7" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.679941 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"deb0f27d-5620-4c5e-b5b0-a068c76c566f","Type":"ContainerStarted","Data":"2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e"} Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.692870 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerStarted","Data":"8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274"} Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.696029 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" event={"ID":"a934f11e-9b01-4a42-ba23-75fbf6461c04","Type":"ContainerStarted","Data":"82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898"} Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.701609 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.709452 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.70941834 podStartE2EDuration="13.70941834s" podCreationTimestamp="2026-03-20 17:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:23.707597958 +0000 UTC m=+1278.573423636" watchObservedRunningTime="2026-03-20 17:53:23.70941834 +0000 UTC m=+1278.575244008" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.710994 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b8586c59-k9gqt" event={"ID":"e1874a81-9fc0-4cb0-a681-8ab78df069a0","Type":"ContainerStarted","Data":"add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da"} Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.711038 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b8586c59-k9gqt" event={"ID":"e1874a81-9fc0-4cb0-a681-8ab78df069a0","Type":"ContainerStarted","Data":"6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c"} Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.711050 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b8586c59-k9gqt" event={"ID":"e1874a81-9fc0-4cb0-a681-8ab78df069a0","Type":"ContainerStarted","Data":"f9f880736267c6488b3b56ef34460cc43c3cee9d34f6fa362e061a623b852b36"} Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.711889 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.779869 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-846cbcbcb-bk7ct"] Mar 20 17:53:23 crc kubenswrapper[4690]: E0320 17:53:23.780315 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a50078-3ca8-4c47-8067-7473a9376323" containerName="placement-db-sync" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.780331 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a50078-3ca8-4c47-8067-7473a9376323" containerName="placement-db-sync" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.780564 4690 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="14a50078-3ca8-4c47-8067-7473a9376323" containerName="placement-db-sync" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.781575 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.781744 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" podStartSLOduration=5.78171516 podStartE2EDuration="5.78171516s" podCreationTimestamp="2026-03-20 17:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:23.741692204 +0000 UTC m=+1278.607517882" watchObservedRunningTime="2026-03-20 17:53:23.78171516 +0000 UTC m=+1278.647540828" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.801885 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.802470 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.802639 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tq8mp" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.802756 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.802855 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.826880 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-846cbcbcb-bk7ct"] Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.834685 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84b8586c59-k9gqt" podStartSLOduration=3.834660065 podStartE2EDuration="3.834660065s" podCreationTimestamp="2026-03-20 17:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:23.802087463 +0000 UTC m=+1278.667913141" watchObservedRunningTime="2026-03-20 17:53:23.834660065 +0000 UTC m=+1278.700485743" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.965678 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeac13cf-6875-434a-b276-fae77a828d02-logs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.965734 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-scripts\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.965774 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-config-data\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " 
pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.965790 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdsz\" (UniqueName: \"kubernetes.io/projected/eeac13cf-6875-434a-b276-fae77a828d02-kube-api-access-6wdsz\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.965821 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-public-tls-certs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.965872 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-combined-ca-bundle\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:23 crc kubenswrapper[4690]: I0320 17:53:23.965902 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-internal-tls-certs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.067061 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-combined-ca-bundle\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.067439 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-internal-tls-certs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.067547 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeac13cf-6875-434a-b276-fae77a828d02-logs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.067570 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-scripts\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.067795 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-config-data\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" 
Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.067823 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdsz\" (UniqueName: \"kubernetes.io/projected/eeac13cf-6875-434a-b276-fae77a828d02-kube-api-access-6wdsz\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.067854 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-public-tls-certs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.072514 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeac13cf-6875-434a-b276-fae77a828d02-logs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.074596 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-internal-tls-certs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.074653 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-public-tls-certs\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.075856 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-config-data\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.078768 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-combined-ca-bundle\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.080422 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-scripts\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.086876 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdsz\" (UniqueName: \"kubernetes.io/projected/eeac13cf-6875-434a-b276-fae77a828d02-kube-api-access-6wdsz\") pod \"placement-846cbcbcb-bk7ct\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.122495 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.695045 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-846cbcbcb-bk7ct"] Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.740479 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-846cbcbcb-bk7ct" event={"ID":"eeac13cf-6875-434a-b276-fae77a828d02","Type":"ContainerStarted","Data":"7b36f31d6af928bfcd59680b0012319f817b3c646a271c618565edc67ffd8ada"} Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.744662 4690 generic.go:334] "Generic (PLEG): container finished" podID="2055a33a-e663-4664-9240-aab3d338c45e" containerID="2d7067f9eb49bcecf0d8a345aba52d85b7adeb2469cdac922ca43bee76fec1df" exitCode=0 Mar 20 17:53:24 crc kubenswrapper[4690]: I0320 17:53:24.745678 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m2vln" event={"ID":"2055a33a-e663-4664-9240-aab3d338c45e","Type":"ContainerDied","Data":"2d7067f9eb49bcecf0d8a345aba52d85b7adeb2469cdac922ca43bee76fec1df"} Mar 20 17:53:25 crc kubenswrapper[4690]: I0320 17:53:25.756696 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-846cbcbcb-bk7ct" event={"ID":"eeac13cf-6875-434a-b276-fae77a828d02","Type":"ContainerStarted","Data":"e273e0db8a848a00bb240bb0f1a7fd1fe0162b2b098a6033f1f05d2ac6cb0bba"} Mar 20 17:53:25 crc kubenswrapper[4690]: I0320 17:53:25.757019 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-846cbcbcb-bk7ct" event={"ID":"eeac13cf-6875-434a-b276-fae77a828d02","Type":"ContainerStarted","Data":"660b50aac9a97ca5f210780af2fbe5551050f7fb5a3827150cecccb01b045837"} Mar 20 17:53:25 crc kubenswrapper[4690]: I0320 17:53:25.789618 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-846cbcbcb-bk7ct" podStartSLOduration=2.789596555 podStartE2EDuration="2.789596555s" podCreationTimestamp="2026-03-20 17:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:25.775830941 +0000 UTC m=+1280.641656629" watchObservedRunningTime="2026-03-20 17:53:25.789596555 +0000 UTC m=+1280.655422243" Mar 20 17:53:26 crc kubenswrapper[4690]: I0320 17:53:26.763433 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:26 crc kubenswrapper[4690]: I0320 17:53:26.763771 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.686102 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.726544 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.750555 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.750625 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.756932 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pn26k"] Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.757347 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" podUID="c491527c-0ddb-41bc-86f2-78334b0b3075" containerName="dnsmasq-dns" containerID="cri-o://9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8" gracePeriod=10 Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.810996 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m2vln" event={"ID":"2055a33a-e663-4664-9240-aab3d338c45e","Type":"ContainerDied","Data":"497b63fb8f9a7e7b166b696d40aad1501583fcdb5de96bd16904abbdc8df6ebf"} Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.811040 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="497b63fb8f9a7e7b166b696d40aad1501583fcdb5de96bd16904abbdc8df6ebf" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.811099 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m2vln" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.844525 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.849326 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.853177 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.889161 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-credential-keys\") pod \"2055a33a-e663-4664-9240-aab3d338c45e\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.889328 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-scripts\") pod \"2055a33a-e663-4664-9240-aab3d338c45e\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.889447 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h57tl\" (UniqueName: \"kubernetes.io/projected/2055a33a-e663-4664-9240-aab3d338c45e-kube-api-access-h57tl\") pod \"2055a33a-e663-4664-9240-aab3d338c45e\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.889484 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-fernet-keys\") pod 
\"2055a33a-e663-4664-9240-aab3d338c45e\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.889507 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-combined-ca-bundle\") pod \"2055a33a-e663-4664-9240-aab3d338c45e\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.889535 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-config-data\") pod \"2055a33a-e663-4664-9240-aab3d338c45e\" (UID: \"2055a33a-e663-4664-9240-aab3d338c45e\") " Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.897882 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2055a33a-e663-4664-9240-aab3d338c45e" (UID: "2055a33a-e663-4664-9240-aab3d338c45e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.899514 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-scripts" (OuterVolumeSpecName: "scripts") pod "2055a33a-e663-4664-9240-aab3d338c45e" (UID: "2055a33a-e663-4664-9240-aab3d338c45e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.900487 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2055a33a-e663-4664-9240-aab3d338c45e" (UID: "2055a33a-e663-4664-9240-aab3d338c45e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.900638 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2055a33a-e663-4664-9240-aab3d338c45e-kube-api-access-h57tl" (OuterVolumeSpecName: "kube-api-access-h57tl") pod "2055a33a-e663-4664-9240-aab3d338c45e" (UID: "2055a33a-e663-4664-9240-aab3d338c45e"). InnerVolumeSpecName "kube-api-access-h57tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.918870 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-config-data" (OuterVolumeSpecName: "config-data") pod "2055a33a-e663-4664-9240-aab3d338c45e" (UID: "2055a33a-e663-4664-9240-aab3d338c45e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.919109 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2055a33a-e663-4664-9240-aab3d338c45e" (UID: "2055a33a-e663-4664-9240-aab3d338c45e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.991509 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h57tl\" (UniqueName: \"kubernetes.io/projected/2055a33a-e663-4664-9240-aab3d338c45e-kube-api-access-h57tl\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.991734 4690 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.991744 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.991752 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.991761 4690 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:28 crc kubenswrapper[4690]: I0320 17:53:28.991769 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2055a33a-e663-4664-9240-aab3d338c45e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.204612 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.317104 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-config\") pod \"c491527c-0ddb-41bc-86f2-78334b0b3075\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.317264 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-swift-storage-0\") pod \"c491527c-0ddb-41bc-86f2-78334b0b3075\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.317296 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzlvs\" (UniqueName: \"kubernetes.io/projected/c491527c-0ddb-41bc-86f2-78334b0b3075-kube-api-access-tzlvs\") pod \"c491527c-0ddb-41bc-86f2-78334b0b3075\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.317323 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-sb\") pod \"c491527c-0ddb-41bc-86f2-78334b0b3075\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.317371 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-nb\") pod 
\"c491527c-0ddb-41bc-86f2-78334b0b3075\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.317395 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-svc\") pod \"c491527c-0ddb-41bc-86f2-78334b0b3075\" (UID: \"c491527c-0ddb-41bc-86f2-78334b0b3075\") " Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.332321 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c491527c-0ddb-41bc-86f2-78334b0b3075-kube-api-access-tzlvs" (OuterVolumeSpecName: "kube-api-access-tzlvs") pod "c491527c-0ddb-41bc-86f2-78334b0b3075" (UID: "c491527c-0ddb-41bc-86f2-78334b0b3075"). InnerVolumeSpecName "kube-api-access-tzlvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.361098 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-config" (OuterVolumeSpecName: "config") pod "c491527c-0ddb-41bc-86f2-78334b0b3075" (UID: "c491527c-0ddb-41bc-86f2-78334b0b3075"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.371136 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c491527c-0ddb-41bc-86f2-78334b0b3075" (UID: "c491527c-0ddb-41bc-86f2-78334b0b3075"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.371755 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c491527c-0ddb-41bc-86f2-78334b0b3075" (UID: "c491527c-0ddb-41bc-86f2-78334b0b3075"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.380910 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c491527c-0ddb-41bc-86f2-78334b0b3075" (UID: "c491527c-0ddb-41bc-86f2-78334b0b3075"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.383986 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c491527c-0ddb-41bc-86f2-78334b0b3075" (UID: "c491527c-0ddb-41bc-86f2-78334b0b3075"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.418931 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.418966 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.418975 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.418984 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.418996 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzlvs\" (UniqueName: \"kubernetes.io/projected/c491527c-0ddb-41bc-86f2-78334b0b3075-kube-api-access-tzlvs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.419004 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c491527c-0ddb-41bc-86f2-78334b0b3075-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.688100 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.688156 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.840318 4690 generic.go:334] "Generic (PLEG): container finished" podID="c491527c-0ddb-41bc-86f2-78334b0b3075" containerID="9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8" exitCode=0 Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.840717 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" event={"ID":"c491527c-0ddb-41bc-86f2-78334b0b3075","Type":"ContainerDied","Data":"9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8"} Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.840747 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" event={"ID":"c491527c-0ddb-41bc-86f2-78334b0b3075","Type":"ContainerDied","Data":"e8a264045acf53343689a596ede5744acdfe25296397850af8ec3e9effe60730"} Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.840762 4690 scope.go:117] "RemoveContainer" containerID="9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.840899 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pn26k" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.871206 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerStarted","Data":"fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b"} Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.872280 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.880849 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b966595d7-ccrp2"] Mar 20 17:53:29 crc kubenswrapper[4690]: E0320 17:53:29.881555 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c491527c-0ddb-41bc-86f2-78334b0b3075" containerName="dnsmasq-dns" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.881649 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c491527c-0ddb-41bc-86f2-78334b0b3075" containerName="dnsmasq-dns" Mar 20 17:53:29 crc kubenswrapper[4690]: E0320 17:53:29.881742 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2055a33a-e663-4664-9240-aab3d338c45e" containerName="keystone-bootstrap" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.881825 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="2055a33a-e663-4664-9240-aab3d338c45e" containerName="keystone-bootstrap" Mar 20 17:53:29 crc kubenswrapper[4690]: E0320 17:53:29.881910 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c491527c-0ddb-41bc-86f2-78334b0b3075" containerName="init" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.881983 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c491527c-0ddb-41bc-86f2-78334b0b3075" containerName="init" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.882334 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="2055a33a-e663-4664-9240-aab3d338c45e" containerName="keystone-bootstrap" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.882456 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c491527c-0ddb-41bc-86f2-78334b0b3075" containerName="dnsmasq-dns" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.883228 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.887831 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.888214 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.888367 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7l4fq" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.888475 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.888618 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.888726 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.898473 4690 scope.go:117] "RemoveContainer" containerID="75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.960984 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.961018 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b966595d7-ccrp2"] Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.961035 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.978387 4690 scope.go:117] "RemoveContainer" containerID="9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.981014 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pn26k"] Mar 20 17:53:29 crc kubenswrapper[4690]: E0320 17:53:29.982319 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8\": container with ID starting with 9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8 not found: ID does not exist" containerID="9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.982347 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8"} err="failed to get container status \"9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8\": rpc error: code = NotFound desc = could not find container \"9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8\": container with ID starting with 9ebf62b4eb2a590670aef2257f8b622aa85255d20524cbe44ec03ce8bc9710d8 not found: ID does not exist" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.982367 4690 scope.go:117] "RemoveContainer" containerID="75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7" Mar 20 17:53:29 crc kubenswrapper[4690]: E0320 17:53:29.986327 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7\": container with ID starting with 75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7 not found: ID does not exist" containerID="75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.986353 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7"} err="failed to get container status \"75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7\": rpc error: code = NotFound desc = could not find container \"75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7\": container with ID starting with 75f31e088a4d0f24d44802f59974fd215688b0bbfe529b9a1c55ac2368d286f7 not found: ID does not exist" Mar 20 17:53:29 crc kubenswrapper[4690]: I0320 17:53:29.988311 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pn26k"] Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.034455 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqr58\" (UniqueName: \"kubernetes.io/projected/112d1eb4-f375-4825-94e3-d721fbafbeaa-kube-api-access-xqr58\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.034501 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-scripts\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.034559 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-config-data\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.034595 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-credential-keys\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.034621 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-internal-tls-certs\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.034802 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-fernet-keys\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.034833 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-public-tls-certs\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.034871 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-combined-ca-bundle\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.136531 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqr58\" (UniqueName: \"kubernetes.io/projected/112d1eb4-f375-4825-94e3-d721fbafbeaa-kube-api-access-xqr58\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.136586 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-scripts\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.136634 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-config-data\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.136692 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-credential-keys\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.136728 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-internal-tls-certs\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.136872 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-fernet-keys\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.136927 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-public-tls-certs\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.136970 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-combined-ca-bundle\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.141084 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-config-data\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.141519 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-credential-keys\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.143270 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-internal-tls-certs\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.144737 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-public-tls-certs\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.147768 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-combined-ca-bundle\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.147817 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-fernet-keys\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.157159 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqr58\" (UniqueName: \"kubernetes.io/projected/112d1eb4-f375-4825-94e3-d721fbafbeaa-kube-api-access-xqr58\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.157438 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/112d1eb4-f375-4825-94e3-d721fbafbeaa-scripts\") pod \"keystone-b966595d7-ccrp2\" (UID: \"112d1eb4-f375-4825-94e3-d721fbafbeaa\") " pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.302116 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.731072 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.731458 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.788200 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.790141 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.853917 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b966595d7-ccrp2"] Mar 20 17:53:30 crc kubenswrapper[4690]: W0320 17:53:30.865620 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod112d1eb4_f375_4825_94e3_d721fbafbeaa.slice/crio-e4444986f778fe6457a1890e05d3257f7495adb519c2121f25cc09dfb62452ae WatchSource:0}: Error finding container e4444986f778fe6457a1890e05d3257f7495adb519c2121f25cc09dfb62452ae: Status 404 returned error can't find the container with id e4444986f778fe6457a1890e05d3257f7495adb519c2121f25cc09dfb62452ae Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.911026 4690 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.913534 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b966595d7-ccrp2" event={"ID":"112d1eb4-f375-4825-94e3-d721fbafbeaa","Type":"ContainerStarted","Data":"e4444986f778fe6457a1890e05d3257f7495adb519c2121f25cc09dfb62452ae"} Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.914741 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:30 crc kubenswrapper[4690]: I0320 17:53:30.914763 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:31 crc kubenswrapper[4690]: I0320 17:53:31.092274 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:53:31 crc kubenswrapper[4690]: I0320 17:53:31.907537 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c491527c-0ddb-41bc-86f2-78334b0b3075" path="/var/lib/kubelet/pods/c491527c-0ddb-41bc-86f2-78334b0b3075/volumes" Mar 20 17:53:31 crc kubenswrapper[4690]: I0320 17:53:31.935363 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b966595d7-ccrp2" event={"ID":"112d1eb4-f375-4825-94e3-d721fbafbeaa","Type":"ContainerStarted","Data":"63bc0da7f0f78ee76bc2db636f24b060c1b2f6f33305ed4daf22f571aebfb80c"} Mar 20 17:53:31 crc kubenswrapper[4690]: I0320 17:53:31.937212 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:53:31 crc kubenswrapper[4690]: I0320 17:53:31.945775 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m4wn2" event={"ID":"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0","Type":"ContainerStarted","Data":"12214f370e00b564156a5d39029a1d0d2c6c81f91e1b974f4dcdfb2559e97034"} Mar 20 17:53:31 
crc kubenswrapper[4690]: I0320 17:53:31.955428 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wqk6t" event={"ID":"ef3bcd50-5724-42a1-92df-262256c07d45","Type":"ContainerStarted","Data":"5a32bba702758b598f20bc79be94a3a7ce52e126fb8463b7326ee083977e5faa"} Mar 20 17:53:31 crc kubenswrapper[4690]: I0320 17:53:31.955453 4690 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:53:31 crc kubenswrapper[4690]: I0320 17:53:31.962890 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b966595d7-ccrp2" podStartSLOduration=2.962868243 podStartE2EDuration="2.962868243s" podCreationTimestamp="2026-03-20 17:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:31.959329432 +0000 UTC m=+1286.825155200" watchObservedRunningTime="2026-03-20 17:53:31.962868243 +0000 UTC m=+1286.828693921" Mar 20 17:53:32 crc kubenswrapper[4690]: I0320 17:53:32.014357 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wqk6t" podStartSLOduration=2.797046943 podStartE2EDuration="42.014335527s" podCreationTimestamp="2026-03-20 17:52:50 +0000 UTC" firstStartedPulling="2026-03-20 17:52:51.637868095 +0000 UTC m=+1246.503693773" lastFinishedPulling="2026-03-20 17:53:30.855156659 +0000 UTC m=+1285.720982357" observedRunningTime="2026-03-20 17:53:31.988473456 +0000 UTC m=+1286.854299154" watchObservedRunningTime="2026-03-20 17:53:32.014335527 +0000 UTC m=+1286.880161205" Mar 20 17:53:32 crc kubenswrapper[4690]: I0320 17:53:32.021225 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-m4wn2" podStartSLOduration=3.131714594 podStartE2EDuration="42.021160362s" podCreationTimestamp="2026-03-20 17:52:50 +0000 UTC" firstStartedPulling="2026-03-20 17:52:51.966768341 +0000 UTC m=+1246.832594019" lastFinishedPulling="2026-03-20 17:53:30.856214109 +0000 UTC m=+1285.722039787" observedRunningTime="2026-03-20 17:53:32.010105516 +0000 UTC m=+1286.875931194" watchObservedRunningTime="2026-03-20 17:53:32.021160362 +0000 UTC m=+1286.886986040" Mar 20 17:53:32 crc kubenswrapper[4690]: I0320 17:53:32.355558 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:53:32 crc kubenswrapper[4690]: I0320 17:53:32.973783 4690 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:53:32 crc kubenswrapper[4690]: I0320 17:53:32.973809 4690 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:53:33 crc kubenswrapper[4690]: I0320 17:53:33.713792 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:33 crc kubenswrapper[4690]: I0320 17:53:33.976937 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:53:35 crc kubenswrapper[4690]: I0320 17:53:35.016976 4690 generic.go:334] "Generic (PLEG): container finished" podID="d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" containerID="12214f370e00b564156a5d39029a1d0d2c6c81f91e1b974f4dcdfb2559e97034" exitCode=0 Mar 20 17:53:35 crc kubenswrapper[4690]: I0320 17:53:35.018325 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m4wn2" 
event={"ID":"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0","Type":"ContainerDied","Data":"12214f370e00b564156a5d39029a1d0d2c6c81f91e1b974f4dcdfb2559e97034"} Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.310297 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.487019 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-db-sync-config-data\") pod \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.487185 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-combined-ca-bundle\") pod \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.487231 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ck7x\" (UniqueName: \"kubernetes.io/projected/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-kube-api-access-6ck7x\") pod \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\" (UID: \"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0\") " Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.509675 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" (UID: "d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.509834 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-kube-api-access-6ck7x" (OuterVolumeSpecName: "kube-api-access-6ck7x") pod "d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" (UID: "d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0"). InnerVolumeSpecName "kube-api-access-6ck7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.523437 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" (UID: "d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.588691 4690 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.588717 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:37 crc kubenswrapper[4690]: I0320 17:53:37.588729 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ck7x\" (UniqueName: \"kubernetes.io/projected/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0-kube-api-access-6ck7x\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.050135 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-m4wn2" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.050143 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m4wn2" event={"ID":"d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0","Type":"ContainerDied","Data":"269ee3c16fb9ae06427e37a08ebc86df57d7a3c610371a6b5fa5e41aa7314240"} Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.050586 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="269ee3c16fb9ae06427e37a08ebc86df57d7a3c610371a6b5fa5e41aa7314240" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.055749 4690 generic.go:334] "Generic (PLEG): container finished" podID="ef3bcd50-5724-42a1-92df-262256c07d45" containerID="5a32bba702758b598f20bc79be94a3a7ce52e126fb8463b7326ee083977e5faa" exitCode=0 Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.055800 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wqk6t" event={"ID":"ef3bcd50-5724-42a1-92df-262256c07d45","Type":"ContainerDied","Data":"5a32bba702758b598f20bc79be94a3a7ce52e126fb8463b7326ee083977e5faa"} Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.543290 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f7645db4-ph4nl"] Mar 20 17:53:38 crc kubenswrapper[4690]: E0320 17:53:38.543680 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" containerName="barbican-db-sync" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.543691 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" containerName="barbican-db-sync" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.543853 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" containerName="barbican-db-sync" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.544695 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.550286 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.555488 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.555888 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hbpx6" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.621356 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-76d4bbb45f-p692s"] Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.640598 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.649740 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.667134 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76d4bbb45f-p692s"] Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.675612 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f7645db4-ph4nl"] Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.692472 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dhqzh"] Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.700584 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dhqzh"] Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.700691 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.707658 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpn52\" (UniqueName: \"kubernetes.io/projected/eb2225a2-e763-42d5-affd-562463c266e6-kube-api-access-lpn52\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.707720 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-combined-ca-bundle\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.707819 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-config-data-custom\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.707836 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb2225a2-e763-42d5-affd-562463c266e6-logs\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.707862 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-config-data\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.807422 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-78ff979d76-5nxvv"] Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809423 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5tdk\" (UniqueName: \"kubernetes.io/projected/c9f236dc-288b-4297-a7d4-f8ee50ba166e-kube-api-access-x5tdk\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809457 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-config-data\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809487 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-combined-ca-bundle\") pod 
\"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809515 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpn52\" (UniqueName: \"kubernetes.io/projected/eb2225a2-e763-42d5-affd-562463c266e6-kube-api-access-lpn52\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809536 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-config-data-custom\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809563 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809581 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-logs\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809598 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-combined-ca-bundle\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809628 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-config\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809662 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809701 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809771 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-config-data-custom\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809802 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb2225a2-e763-42d5-affd-562463c266e6-logs\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809850 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809877 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp9x6\" (UniqueName: \"kubernetes.io/projected/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-kube-api-access-rp9x6\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.809902 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-config-data\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.810539 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.811181 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb2225a2-e763-42d5-affd-562463c266e6-logs\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.815826 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.816237 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-config-data-custom\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.826604 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-combined-ca-bundle\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.836176 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb2225a2-e763-42d5-affd-562463c266e6-config-data\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.837457 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78ff979d76-5nxvv"] Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.852617 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpn52\" (UniqueName: \"kubernetes.io/projected/eb2225a2-e763-42d5-affd-562463c266e6-kube-api-access-lpn52\") pod \"barbican-keystone-listener-7f7645db4-ph4nl\" (UID: \"eb2225a2-e763-42d5-affd-562463c266e6\") " pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912109 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-combined-ca-bundle\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912153 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data-custom\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912203 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5tdk\" (UniqueName: \"kubernetes.io/projected/c9f236dc-288b-4297-a7d4-f8ee50ba166e-kube-api-access-x5tdk\") pod 
\"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912231 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-config-data\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912280 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-combined-ca-bundle\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912303 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-config-data-custom\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912321 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7k9p\" (UniqueName: \"kubernetes.io/projected/61654056-0ac2-4383-9018-4ddd302b2620-kube-api-access-f7k9p\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912345 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912363 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-logs\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912455 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61654056-0ac2-4383-9018-4ddd302b2620-logs\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912516 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-config\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912593 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-nb\") pod 
\"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912672 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912691 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-logs\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912721 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912781 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.912814 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp9x6\" (UniqueName: \"kubernetes.io/projected/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-kube-api-access-rp9x6\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.913445 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.913605 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.913906 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.913937 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: 
\"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.913999 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-config\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.915541 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-combined-ca-bundle\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.916330 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-config-data\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.917014 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-config-data-custom\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.923103 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.928884 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5tdk\" (UniqueName: \"kubernetes.io/projected/c9f236dc-288b-4297-a7d4-f8ee50ba166e-kube-api-access-x5tdk\") pod \"dnsmasq-dns-85ff748b95-dhqzh\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.930243 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp9x6\" (UniqueName: \"kubernetes.io/projected/d6df0c1b-ea55-44ae-8fb9-9573c54322a8-kube-api-access-rp9x6\") pod \"barbican-worker-76d4bbb45f-p692s\" (UID: \"d6df0c1b-ea55-44ae-8fb9-9573c54322a8\") " pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:38 crc kubenswrapper[4690]: I0320 17:53:38.993759 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-76d4bbb45f-p692s" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.017629 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7k9p\" (UniqueName: \"kubernetes.io/projected/61654056-0ac2-4383-9018-4ddd302b2620-kube-api-access-f7k9p\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.017693 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61654056-0ac2-4383-9018-4ddd302b2620-logs\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.017757 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.017851 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-combined-ca-bundle\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.017867 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data-custom\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.019045 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61654056-0ac2-4383-9018-4ddd302b2620-logs\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.021367 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-combined-ca-bundle\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.022570 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.023090 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data-custom\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 
crc kubenswrapper[4690]: I0320 17:53:39.025734 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.045496 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7k9p\" (UniqueName: \"kubernetes.io/projected/61654056-0ac2-4383-9018-4ddd302b2620-kube-api-access-f7k9p\") pod \"barbican-api-78ff979d76-5nxvv\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.197373 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.379459 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.527121 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-config-data\") pod \"ef3bcd50-5724-42a1-92df-262256c07d45\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.527507 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-combined-ca-bundle\") pod \"ef3bcd50-5724-42a1-92df-262256c07d45\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.527770 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-scripts\") pod \"ef3bcd50-5724-42a1-92df-262256c07d45\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.527840 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gqwx\" (UniqueName: \"kubernetes.io/projected/ef3bcd50-5724-42a1-92df-262256c07d45-kube-api-access-8gqwx\") pod \"ef3bcd50-5724-42a1-92df-262256c07d45\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.527926 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef3bcd50-5724-42a1-92df-262256c07d45-etc-machine-id\") pod \"ef3bcd50-5724-42a1-92df-262256c07d45\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.527981 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-db-sync-config-data\") pod \"ef3bcd50-5724-42a1-92df-262256c07d45\" (UID: \"ef3bcd50-5724-42a1-92df-262256c07d45\") " Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.530076 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef3bcd50-5724-42a1-92df-262256c07d45-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ef3bcd50-5724-42a1-92df-262256c07d45" (UID: "ef3bcd50-5724-42a1-92df-262256c07d45"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.538821 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3bcd50-5724-42a1-92df-262256c07d45-kube-api-access-8gqwx" (OuterVolumeSpecName: "kube-api-access-8gqwx") pod "ef3bcd50-5724-42a1-92df-262256c07d45" (UID: "ef3bcd50-5724-42a1-92df-262256c07d45"). InnerVolumeSpecName "kube-api-access-8gqwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.540754 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-scripts" (OuterVolumeSpecName: "scripts") pod "ef3bcd50-5724-42a1-92df-262256c07d45" (UID: "ef3bcd50-5724-42a1-92df-262256c07d45"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.543540 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ef3bcd50-5724-42a1-92df-262256c07d45" (UID: "ef3bcd50-5724-42a1-92df-262256c07d45"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.572748 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef3bcd50-5724-42a1-92df-262256c07d45" (UID: "ef3bcd50-5724-42a1-92df-262256c07d45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.615437 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-config-data" (OuterVolumeSpecName: "config-data") pod "ef3bcd50-5724-42a1-92df-262256c07d45" (UID: "ef3bcd50-5724-42a1-92df-262256c07d45"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.630418 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.630454 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.630466 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.630475 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gqwx\" (UniqueName: \"kubernetes.io/projected/ef3bcd50-5724-42a1-92df-262256c07d45-kube-api-access-8gqwx\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.630484 4690 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef3bcd50-5724-42a1-92df-262256c07d45-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.630495 4690 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef3bcd50-5724-42a1-92df-262256c07d45-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.696080 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-867c5896-qkwmr" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.859473 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76d4bbb45f-p692s"] Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.946852 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-dc95ccffb-gvrdq" podUID="799b195a-e6e5-4a19-b41a-1c7550e21e90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Mar 20 17:53:39 crc kubenswrapper[4690]: I0320 17:53:39.963120 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f7645db4-ph4nl"] Mar 20 17:53:40 crc kubenswrapper[4690]: W0320 17:53:40.049011 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61654056_0ac2_4383_9018_4ddd302b2620.slice/crio-cd16dd73f52d56a362f16d90398a87d5de93a54ec31c9949ffe4fe4f75ff39c1 WatchSource:0}: Error finding container cd16dd73f52d56a362f16d90398a87d5de93a54ec31c9949ffe4fe4f75ff39c1: Status 404 returned error can't find the container with id cd16dd73f52d56a362f16d90398a87d5de93a54ec31c9949ffe4fe4f75ff39c1 Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.050830 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78ff979d76-5nxvv"] Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.058160 
4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dhqzh"] Mar 20 17:53:40 crc kubenswrapper[4690]: W0320 17:53:40.058248 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9f236dc_288b_4297_a7d4_f8ee50ba166e.slice/crio-ec722365507d320ad8327b95edd30d888ac4ad3ae14c899566e0f8441cda6250 WatchSource:0}: Error finding container ec722365507d320ad8327b95edd30d888ac4ad3ae14c899566e0f8441cda6250: Status 404 returned error can't find the container with id ec722365507d320ad8327b95edd30d888ac4ad3ae14c899566e0f8441cda6250 Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.112642 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d4bbb45f-p692s" event={"ID":"d6df0c1b-ea55-44ae-8fb9-9573c54322a8","Type":"ContainerStarted","Data":"5fb81d609f4dc3f10b90e260a6fbd8d70912aebab5281d9c1dcd73cd0b49ac1d"} Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.115482 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" event={"ID":"c9f236dc-288b-4297-a7d4-f8ee50ba166e","Type":"ContainerStarted","Data":"ec722365507d320ad8327b95edd30d888ac4ad3ae14c899566e0f8441cda6250"} Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.119865 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" event={"ID":"eb2225a2-e763-42d5-affd-562463c266e6","Type":"ContainerStarted","Data":"2619eba5916ef5b8cbc19b9f62e574a724bd119801c1274eff584e64aa533c0f"} Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.128373 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wqk6t" event={"ID":"ef3bcd50-5724-42a1-92df-262256c07d45","Type":"ContainerDied","Data":"ce7b0954fa2bd6328ca83735ac00e5f329ae92e775b41ebd79c2c75e40d6a856"} Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.128429 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce7b0954fa2bd6328ca83735ac00e5f329ae92e775b41ebd79c2c75e40d6a856" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.128404 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wqk6t" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.132863 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78ff979d76-5nxvv" event={"ID":"61654056-0ac2-4383-9018-4ddd302b2620","Type":"ContainerStarted","Data":"cd16dd73f52d56a362f16d90398a87d5de93a54ec31c9949ffe4fe4f75ff39c1"} Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.138080 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerStarted","Data":"d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3"} Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.138563 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.138951 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="sg-core" containerID="cri-o://fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b" gracePeriod=30 Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.139095 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="proxy-httpd" containerID="cri-o://d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3" gracePeriod=30 Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.139370 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="ceilometer-notification-agent" containerID="cri-o://8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274" gracePeriod=30 Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.139509 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="ceilometer-central-agent" containerID="cri-o://b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd" gracePeriod=30 Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.161584 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.924196213 podStartE2EDuration="50.161568489s" podCreationTimestamp="2026-03-20 17:52:50 +0000 UTC" firstStartedPulling="2026-03-20 17:52:52.207082942 +0000 UTC m=+1247.072908620" lastFinishedPulling="2026-03-20 17:53:39.444455218 +0000 UTC m=+1294.310280896" observedRunningTime="2026-03-20 17:53:40.159192461 +0000 UTC m=+1295.025018139" watchObservedRunningTime="2026-03-20 17:53:40.161568489 +0000 UTC m=+1295.027394167" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.262831 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:40 crc kubenswrapper[4690]: E0320 17:53:40.263197 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3bcd50-5724-42a1-92df-262256c07d45" containerName="cinder-db-sync" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.263213 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3bcd50-5724-42a1-92df-262256c07d45" containerName="cinder-db-sync" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.263447 4690 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ef3bcd50-5724-42a1-92df-262256c07d45" containerName="cinder-db-sync" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.264317 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.267215 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-phs8q" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.267453 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.267489 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.267510 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.284195 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.313592 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dhqzh"] Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.343216 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.343345 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.343367 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2dbf4c6-e3eb-4984-a39a-0981181cea31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.343460 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.343480 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pczz\" (UniqueName: \"kubernetes.io/projected/c2dbf4c6-e3eb-4984-a39a-0981181cea31-kube-api-access-2pczz\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.343506 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-scripts\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " 
pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.344362 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m9hgh"] Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.345811 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.398072 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m9hgh"] Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.444852 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.444948 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445034 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-config\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445055 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445090 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445113 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2dbf4c6-e3eb-4984-a39a-0981181cea31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445138 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445180 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445205 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pczz\" (UniqueName: \"kubernetes.io/projected/c2dbf4c6-e3eb-4984-a39a-0981181cea31-kube-api-access-2pczz\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445222 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445244 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmp4\" (UniqueName: \"kubernetes.io/projected/14ff8ae3-f423-405a-bdef-0805c9925ba5-kube-api-access-hhmp4\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445275 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-scripts\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.445971 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2dbf4c6-e3eb-4984-a39a-0981181cea31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.454964 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.460936 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.461748 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.462698 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-scripts\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.463289 4690 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-api-0"] Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.467545 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.476708 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.476913 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pczz\" (UniqueName: \"kubernetes.io/projected/c2dbf4c6-e3eb-4984-a39a-0981181cea31-kube-api-access-2pczz\") pod \"cinder-scheduler-0\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.480525 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546237 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546332 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6xgp\" (UniqueName: \"kubernetes.io/projected/a5823518-338b-4aeb-8425-7e7bd2463a3a-kube-api-access-g6xgp\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546360 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5823518-338b-4aeb-8425-7e7bd2463a3a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546382 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546421 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5823518-338b-4aeb-8425-7e7bd2463a3a-logs\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546445 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546461 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-config\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546480 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.546528 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-scripts\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.547284 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.547371 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-config\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.547506 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.547553 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.547592 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.547608 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmp4\" (UniqueName: \"kubernetes.io/projected/14ff8ae3-f423-405a-bdef-0805c9925ba5-kube-api-access-hhmp4\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.547781 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.548320 4690 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.548486 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.563629 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmp4\" (UniqueName: \"kubernetes.io/projected/14ff8ae3-f423-405a-bdef-0805c9925ba5-kube-api-access-hhmp4\") pod \"dnsmasq-dns-5c9776ccc5-m9hgh\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.622723 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.649156 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.649312 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6xgp\" (UniqueName: \"kubernetes.io/projected/a5823518-338b-4aeb-8425-7e7bd2463a3a-kube-api-access-g6xgp\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.649346 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5823518-338b-4aeb-8425-7e7bd2463a3a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.649373 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.649423 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5823518-338b-4aeb-8425-7e7bd2463a3a-logs\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.649455 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.649503 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-scripts\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.649697 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5823518-338b-4aeb-8425-7e7bd2463a3a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.651192 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5823518-338b-4aeb-8425-7e7bd2463a3a-logs\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.653905 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-scripts\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.655050 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.656116 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.658736 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.669468 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6xgp\" (UniqueName: \"kubernetes.io/projected/a5823518-338b-4aeb-8425-7e7bd2463a3a-kube-api-access-g6xgp\") pod \"cinder-api-0\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " pod="openstack/cinder-api-0" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.720546 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:40 crc kubenswrapper[4690]: I0320 17:53:40.814280 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.091938 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.150320 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c2dbf4c6-e3eb-4984-a39a-0981181cea31","Type":"ContainerStarted","Data":"cfe61d2e5628a1be9ae529eab87694351bc292528f361a6a1c264f31d287b55e"} Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.152985 4690 generic.go:334] "Generic (PLEG): container finished" podID="c9f236dc-288b-4297-a7d4-f8ee50ba166e" containerID="dc7caae3fdce0c7521fb1df19319f654305496ff707f980125d5a440a5a940ac" exitCode=0 Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.153075 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" event={"ID":"c9f236dc-288b-4297-a7d4-f8ee50ba166e","Type":"ContainerDied","Data":"dc7caae3fdce0c7521fb1df19319f654305496ff707f980125d5a440a5a940ac"} Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.156887 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78ff979d76-5nxvv" event={"ID":"61654056-0ac2-4383-9018-4ddd302b2620","Type":"ContainerStarted","Data":"03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8"} Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.156925 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78ff979d76-5nxvv" event={"ID":"61654056-0ac2-4383-9018-4ddd302b2620","Type":"ContainerStarted","Data":"093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c"} Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.157205 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.157310 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.162381 4690 generic.go:334] "Generic (PLEG): container finished" podID="d3959350-36d3-4ea7-92af-94ac690b406e" containerID="d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3" exitCode=0 Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.162408 4690 generic.go:334] "Generic (PLEG): container finished" podID="d3959350-36d3-4ea7-92af-94ac690b406e" containerID="fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b" exitCode=2 Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.162416 4690 generic.go:334] "Generic (PLEG): container finished" podID="d3959350-36d3-4ea7-92af-94ac690b406e" containerID="b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd" exitCode=0 Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.162436 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerDied","Data":"d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3"} Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.162460 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerDied","Data":"fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b"} Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.162471 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerDied","Data":"b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd"} Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.203877 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-78ff979d76-5nxvv" podStartSLOduration=3.203858839 podStartE2EDuration="3.203858839s" podCreationTimestamp="2026-03-20 17:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:41.201016337 +0000 UTC m=+1296.066842015" watchObservedRunningTime="2026-03-20 17:53:41.203858839 +0000 UTC m=+1296.069684517" Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.385719 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m9hgh"] Mar 20 17:53:41 crc kubenswrapper[4690]: I0320 17:53:41.438518 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:53:41 crc kubenswrapper[4690]: W0320 17:53:41.995813 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5823518_338b_4aeb_8425_7e7bd2463a3a.slice/crio-a88ae6bcd2a21283e57b78dc5be2da37aeae8cad44e34f1a474bbed150629d53 WatchSource:0}: Error finding container a88ae6bcd2a21283e57b78dc5be2da37aeae8cad44e34f1a474bbed150629d53: Status 404 returned error can't find the container with id a88ae6bcd2a21283e57b78dc5be2da37aeae8cad44e34f1a474bbed150629d53 Mar 20 17:53:42 crc kubenswrapper[4690]: W0320 17:53:42.005726 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ff8ae3_f423_405a_bdef_0805c9925ba5.slice/crio-d071b197fc3af1e53517392c47fd9d9b11fa1a68db14a7bee4f36dc80346cf65 WatchSource:0}: Error finding container d071b197fc3af1e53517392c47fd9d9b11fa1a68db14a7bee4f36dc80346cf65: Status 404 returned error can't find the container with id d071b197fc3af1e53517392c47fd9d9b11fa1a68db14a7bee4f36dc80346cf65 Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.109329 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.175293 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" event={"ID":"14ff8ae3-f423-405a-bdef-0805c9925ba5","Type":"ContainerStarted","Data":"d071b197fc3af1e53517392c47fd9d9b11fa1a68db14a7bee4f36dc80346cf65"} Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.176918 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" event={"ID":"c9f236dc-288b-4297-a7d4-f8ee50ba166e","Type":"ContainerDied","Data":"ec722365507d320ad8327b95edd30d888ac4ad3ae14c899566e0f8441cda6250"} Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.176966 4690 scope.go:117] "RemoveContainer" containerID="dc7caae3fdce0c7521fb1df19319f654305496ff707f980125d5a440a5a940ac" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.176972 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dhqzh" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.180865 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5823518-338b-4aeb-8425-7e7bd2463a3a","Type":"ContainerStarted","Data":"a88ae6bcd2a21283e57b78dc5be2da37aeae8cad44e34f1a474bbed150629d53"} Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.194776 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-swift-storage-0\") pod \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.194837 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-sb\") pod \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.194873 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-nb\") pod \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.195010 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-svc\") pod \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.195033 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5tdk\" (UniqueName: \"kubernetes.io/projected/c9f236dc-288b-4297-a7d4-f8ee50ba166e-kube-api-access-x5tdk\") pod \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.195091 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-config\") pod \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\" (UID: \"c9f236dc-288b-4297-a7d4-f8ee50ba166e\") " Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.204573 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f236dc-288b-4297-a7d4-f8ee50ba166e-kube-api-access-x5tdk" (OuterVolumeSpecName: "kube-api-access-x5tdk") pod "c9f236dc-288b-4297-a7d4-f8ee50ba166e" (UID: "c9f236dc-288b-4297-a7d4-f8ee50ba166e"). InnerVolumeSpecName "kube-api-access-x5tdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.229666 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c9f236dc-288b-4297-a7d4-f8ee50ba166e" (UID: "c9f236dc-288b-4297-a7d4-f8ee50ba166e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.232627 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-config" (OuterVolumeSpecName: "config") pod "c9f236dc-288b-4297-a7d4-f8ee50ba166e" (UID: "c9f236dc-288b-4297-a7d4-f8ee50ba166e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.234646 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c9f236dc-288b-4297-a7d4-f8ee50ba166e" (UID: "c9f236dc-288b-4297-a7d4-f8ee50ba166e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.239457 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c9f236dc-288b-4297-a7d4-f8ee50ba166e" (UID: "c9f236dc-288b-4297-a7d4-f8ee50ba166e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.245036 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c9f236dc-288b-4297-a7d4-f8ee50ba166e" (UID: "c9f236dc-288b-4297-a7d4-f8ee50ba166e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.297732 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.297762 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.297775 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.297786 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.297799 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5tdk\" (UniqueName: \"kubernetes.io/projected/c9f236dc-288b-4297-a7d4-f8ee50ba166e-kube-api-access-x5tdk\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.297812 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f236dc-288b-4297-a7d4-f8ee50ba166e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.547889 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dhqzh"] Mar 20 
17:53:42 crc kubenswrapper[4690]: I0320 17:53:42.557176 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dhqzh"] Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.197366 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5823518-338b-4aeb-8425-7e7bd2463a3a","Type":"ContainerStarted","Data":"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c"} Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.199423 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c2dbf4c6-e3eb-4984-a39a-0981181cea31","Type":"ContainerStarted","Data":"3a3910af7bc6e45224246856a6c4b3db5027b27a45428a159726fd635325e20f"} Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.203767 4690 generic.go:334] "Generic (PLEG): container finished" podID="14ff8ae3-f423-405a-bdef-0805c9925ba5" containerID="dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749" exitCode=0 Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.203844 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" event={"ID":"14ff8ae3-f423-405a-bdef-0805c9925ba5","Type":"ContainerDied","Data":"dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749"} Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.238999 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d4bbb45f-p692s" event={"ID":"d6df0c1b-ea55-44ae-8fb9-9573c54322a8","Type":"ContainerStarted","Data":"ba74d34f179850426b838a997991cb994d905d5112679a7f5039212d82fd9bc6"} Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.239202 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d4bbb45f-p692s" event={"ID":"d6df0c1b-ea55-44ae-8fb9-9573c54322a8","Type":"ContainerStarted","Data":"2614dcdde1fa3f2cfbb085a0230473f8daf911c74f3b228adc84b7cedfd82413"} Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.243322 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" event={"ID":"eb2225a2-e763-42d5-affd-562463c266e6","Type":"ContainerStarted","Data":"599b0e5cd377b5635f963f120a9bd05adf3097ce3673d713d2af5702b48cd018"} Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.243434 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" event={"ID":"eb2225a2-e763-42d5-affd-562463c266e6","Type":"ContainerStarted","Data":"e00009c7709af7789fdc34eb05e08452b3034c2dc0aacccebdc2c0507cf08b19"} Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.286731 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f7645db4-ph4nl" podStartSLOduration=3.105057868 podStartE2EDuration="5.286709957s" podCreationTimestamp="2026-03-20 17:53:38 +0000 UTC" firstStartedPulling="2026-03-20 17:53:39.967868514 +0000 UTC m=+1294.833694192" lastFinishedPulling="2026-03-20 17:53:42.149520603 +0000 UTC m=+1297.015346281" observedRunningTime="2026-03-20 17:53:43.277509383 +0000 UTC m=+1298.143335061" watchObservedRunningTime="2026-03-20 17:53:43.286709957 +0000 UTC m=+1298.152535625" Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.308774 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-76d4bbb45f-p692s" podStartSLOduration=3.026653693 podStartE2EDuration="5.308754398s" podCreationTimestamp="2026-03-20 
17:53:38 +0000 UTC" firstStartedPulling="2026-03-20 17:53:39.865771101 +0000 UTC m=+1294.731596769" lastFinishedPulling="2026-03-20 17:53:42.147871796 +0000 UTC m=+1297.013697474" observedRunningTime="2026-03-20 17:53:43.296141339 +0000 UTC m=+1298.161967027" watchObservedRunningTime="2026-03-20 17:53:43.308754398 +0000 UTC m=+1298.174580076" Mar 20 17:53:43 crc kubenswrapper[4690]: I0320 17:53:43.895207 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f236dc-288b-4297-a7d4-f8ee50ba166e" path="/var/lib/kubelet/pods/c9f236dc-288b-4297-a7d4-f8ee50ba166e/volumes" Mar 20 17:53:44 crc kubenswrapper[4690]: I0320 17:53:44.262238 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5823518-338b-4aeb-8425-7e7bd2463a3a","Type":"ContainerStarted","Data":"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7"} Mar 20 17:53:44 crc kubenswrapper[4690]: I0320 17:53:44.262377 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 17:53:44 crc kubenswrapper[4690]: I0320 17:53:44.263956 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c2dbf4c6-e3eb-4984-a39a-0981181cea31","Type":"ContainerStarted","Data":"78a6a708e8f1e4570a60795cb1ccaacbc4545031dccf0efe5c2caca106220363"} Mar 20 17:53:44 crc kubenswrapper[4690]: I0320 17:53:44.265897 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" event={"ID":"14ff8ae3-f423-405a-bdef-0805c9925ba5","Type":"ContainerStarted","Data":"ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb"} Mar 20 17:53:44 crc kubenswrapper[4690]: I0320 17:53:44.287735 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.2877172439999995 podStartE2EDuration="4.287717244s" podCreationTimestamp="2026-03-20 17:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:44.281914413 +0000 UTC m=+1299.147740091" watchObservedRunningTime="2026-03-20 17:53:44.287717244 +0000 UTC m=+1299.153542922" Mar 20 17:53:44 crc kubenswrapper[4690]: I0320 17:53:44.311833 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" podStartSLOduration=4.311800921 podStartE2EDuration="4.311800921s" podCreationTimestamp="2026-03-20 17:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:44.308800448 +0000 UTC m=+1299.174626136" watchObservedRunningTime="2026-03-20 17:53:44.311800921 +0000 UTC m=+1299.177626599" Mar 20 17:53:44 crc kubenswrapper[4690]: I0320 17:53:44.339648 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.169709477 podStartE2EDuration="4.339627292s" podCreationTimestamp="2026-03-20 17:53:40 +0000 UTC" firstStartedPulling="2026-03-20 17:53:41.121690506 +0000 UTC m=+1295.987516184" lastFinishedPulling="2026-03-20 17:53:42.291608311 +0000 UTC m=+1297.157433999" observedRunningTime="2026-03-20 17:53:44.332526935 +0000 UTC m=+1299.198352633" watchObservedRunningTime="2026-03-20 17:53:44.339627292 +0000 UTC m=+1299.205452980" Mar 20 17:53:44 crc kubenswrapper[4690]: I0320 17:53:44.758176 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-api-0"] Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.274785 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.395215 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-944584c5d-v2mwf"] Mar 20 17:53:45 crc kubenswrapper[4690]: E0320 17:53:45.395871 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f236dc-288b-4297-a7d4-f8ee50ba166e" containerName="init" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.395889 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f236dc-288b-4297-a7d4-f8ee50ba166e" containerName="init" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.396075 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f236dc-288b-4297-a7d4-f8ee50ba166e" containerName="init" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.396971 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.400057 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.400306 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.406360 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-944584c5d-v2mwf"] Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.590473 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-public-tls-certs\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.590531 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd5ln\" (UniqueName: \"kubernetes.io/projected/7d7354f4-3635-4c6c-a382-f405c559ef59-kube-api-access-fd5ln\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.590561 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-config-data\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.590615 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-config-data-custom\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.590631 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-internal-tls-certs\") pod 
\"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.590689 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-combined-ca-bundle\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.590704 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d7354f4-3635-4c6c-a382-f405c559ef59-logs\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.628376 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.692111 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-config-data-custom\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.692159 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-internal-tls-certs\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.692239 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-combined-ca-bundle\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.692308 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d7354f4-3635-4c6c-a382-f405c559ef59-logs\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.692382 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-public-tls-certs\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.692420 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd5ln\" (UniqueName: \"kubernetes.io/projected/7d7354f4-3635-4c6c-a382-f405c559ef59-kube-api-access-fd5ln\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.692455 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-config-data\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.693612 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d7354f4-3635-4c6c-a382-f405c559ef59-logs\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.697924 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-public-tls-certs\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.699203 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-config-data\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.709947 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-combined-ca-bundle\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.710275 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-internal-tls-certs\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.712783 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d7354f4-3635-4c6c-a382-f405c559ef59-config-data-custom\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.713634 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd5ln\" (UniqueName: \"kubernetes.io/projected/7d7354f4-3635-4c6c-a382-f405c559ef59-kube-api-access-fd5ln\") pod \"barbican-api-944584c5d-v2mwf\" (UID: \"7d7354f4-3635-4c6c-a382-f405c559ef59\") " pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.760562 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:45 crc kubenswrapper[4690]: I0320 17:53:45.863822 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000058 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-log-httpd\") pod \"d3959350-36d3-4ea7-92af-94ac690b406e\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000212 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-combined-ca-bundle\") pod \"d3959350-36d3-4ea7-92af-94ac690b406e\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000266 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-sg-core-conf-yaml\") pod \"d3959350-36d3-4ea7-92af-94ac690b406e\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000368 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-config-data\") pod \"d3959350-36d3-4ea7-92af-94ac690b406e\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000419 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-scripts\") pod \"d3959350-36d3-4ea7-92af-94ac690b406e\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000478 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njdkp\" (UniqueName: \"kubernetes.io/projected/d3959350-36d3-4ea7-92af-94ac690b406e-kube-api-access-njdkp\") pod \"d3959350-36d3-4ea7-92af-94ac690b406e\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000508 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-run-httpd\") pod \"d3959350-36d3-4ea7-92af-94ac690b406e\" (UID: \"d3959350-36d3-4ea7-92af-94ac690b406e\") " Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000644 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3959350-36d3-4ea7-92af-94ac690b406e" (UID: "d3959350-36d3-4ea7-92af-94ac690b406e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.000949 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3959350-36d3-4ea7-92af-94ac690b406e" (UID: "d3959350-36d3-4ea7-92af-94ac690b406e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.001315 4690 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.001340 4690 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3959350-36d3-4ea7-92af-94ac690b406e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.006671 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3959350-36d3-4ea7-92af-94ac690b406e-kube-api-access-njdkp" (OuterVolumeSpecName: "kube-api-access-njdkp") pod "d3959350-36d3-4ea7-92af-94ac690b406e" (UID: "d3959350-36d3-4ea7-92af-94ac690b406e"). InnerVolumeSpecName "kube-api-access-njdkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.017327 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-scripts" (OuterVolumeSpecName: "scripts") pod "d3959350-36d3-4ea7-92af-94ac690b406e" (UID: "d3959350-36d3-4ea7-92af-94ac690b406e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.038393 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3959350-36d3-4ea7-92af-94ac690b406e" (UID: "d3959350-36d3-4ea7-92af-94ac690b406e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.093032 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3959350-36d3-4ea7-92af-94ac690b406e" (UID: "d3959350-36d3-4ea7-92af-94ac690b406e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.103665 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.103693 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njdkp\" (UniqueName: \"kubernetes.io/projected/d3959350-36d3-4ea7-92af-94ac690b406e-kube-api-access-njdkp\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.103704 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.103712 4690 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.104802 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-config-data" (OuterVolumeSpecName: "config-data") pod "d3959350-36d3-4ea7-92af-94ac690b406e" (UID: "d3959350-36d3-4ea7-92af-94ac690b406e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.205729 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3959350-36d3-4ea7-92af-94ac690b406e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.212599 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-944584c5d-v2mwf"] Mar 20 17:53:46 crc kubenswrapper[4690]: W0320 17:53:46.214673 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7354f4_3635_4c6c_a382_f405c559ef59.slice/crio-41c3abd703f87a97012bf99c9332e23771b7c1a5f7f3e8a9fa4f5734ac2abcfd WatchSource:0}: Error finding container 41c3abd703f87a97012bf99c9332e23771b7c1a5f7f3e8a9fa4f5734ac2abcfd: Status 404 returned error can't find the container with id 41c3abd703f87a97012bf99c9332e23771b7c1a5f7f3e8a9fa4f5734ac2abcfd Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.286136 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-944584c5d-v2mwf" event={"ID":"7d7354f4-3635-4c6c-a382-f405c559ef59","Type":"ContainerStarted","Data":"41c3abd703f87a97012bf99c9332e23771b7c1a5f7f3e8a9fa4f5734ac2abcfd"} Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.291644 4690 generic.go:334] "Generic (PLEG): container finished" podID="d3959350-36d3-4ea7-92af-94ac690b406e" containerID="8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274" exitCode=0 Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.291693 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.291708 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerDied","Data":"8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274"} Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.292169 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3959350-36d3-4ea7-92af-94ac690b406e","Type":"ContainerDied","Data":"70b3cc244b65edbea95e6721df694d7d742b7c665bcaf84d71699df2e10bfb9e"} Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.292216 4690 scope.go:117] "RemoveContainer" containerID="d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.292765 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerName="cinder-api-log" containerID="cri-o://bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c" gracePeriod=30 Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.292869 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerName="cinder-api" containerID="cri-o://81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7" gracePeriod=30 Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.324552 4690 scope.go:117] "RemoveContainer" containerID="fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.348229 4690 scope.go:117] "RemoveContainer" containerID="8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.358180 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.385197 4690 scope.go:117] "RemoveContainer" containerID="b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.386325 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.405599 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:53:46 crc kubenswrapper[4690]: E0320 17:53:46.406227 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="proxy-httpd" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.406285 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="proxy-httpd" Mar 20 17:53:46 crc kubenswrapper[4690]: E0320 17:53:46.406306 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="ceilometer-central-agent" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.406318 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="ceilometer-central-agent" Mar 20 17:53:46 crc kubenswrapper[4690]: E0320 17:53:46.406388 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="sg-core" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.406406 
4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="sg-core" Mar 20 17:53:46 crc kubenswrapper[4690]: E0320 17:53:46.406433 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="ceilometer-notification-agent" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.406445 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="ceilometer-notification-agent" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.406700 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="ceilometer-notification-agent" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.406727 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="ceilometer-central-agent" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.406745 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="sg-core" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.406762 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" containerName="proxy-httpd" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.409308 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.414325 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.418401 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.418447 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.461470 4690 scope.go:117] "RemoveContainer" containerID="d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3" Mar 20 17:53:46 crc kubenswrapper[4690]: E0320 17:53:46.461831 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3\": container with ID starting with d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3 not found: ID does not exist" containerID="d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.461854 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3"} err="failed to get container status \"d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3\": rpc error: code = NotFound desc = could not find container \"d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3\": container with ID starting with d7e706118957fff69c764285ed8e3aa61937a5d2ba48bed03d1ddd3c4bd727f3 not found: ID does not exist" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.461874 4690 scope.go:117] "RemoveContainer" containerID="fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b" Mar 20 17:53:46 crc kubenswrapper[4690]: E0320 17:53:46.462118 4690 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b\": container with ID starting with fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b not found: ID does not exist" containerID="fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.462131 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b"} err="failed to get container status \"fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b\": rpc error: code = NotFound desc = could not find container \"fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b\": container with ID starting with fc19a725163dc985f48cb92b28e512a1cc93f3c84065130f45af53fa8bc63b9b not found: ID does not exist" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.462142 4690 scope.go:117] "RemoveContainer" containerID="8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274" Mar 20 17:53:46 crc kubenswrapper[4690]: E0320 17:53:46.462364 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274\": container with ID starting with 8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274 not found: ID does not exist" containerID="8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.462384 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274"} err="failed to get container status \"8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274\": rpc error: code = NotFound desc = could not find container \"8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274\": container with ID starting with 8060b4fcb26959d8193b3d3c27a2c289d9a43a03978a9852fe94303c958c5274 not found: ID does not exist" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.462397 4690 scope.go:117] "RemoveContainer" containerID="b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd" Mar 20 17:53:46 crc kubenswrapper[4690]: E0320 17:53:46.462582 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd\": container with ID starting with b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd not found: ID does not exist" containerID="b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.462600 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd"} err="failed to get container status \"b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd\": rpc error: code = NotFound desc = could not find container \"b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd\": container with ID starting with b23e19d0c9c2d945939c246d0d2686ed67c7d671693f2df3bb0092146af680bd not found: ID does not exist" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.513321 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrjf\" (UniqueName: \"kubernetes.io/projected/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-kube-api-access-kdrjf\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.513392 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.513423 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-log-httpd\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.513484 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.513563 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-scripts\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.513850 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-config-data\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.513904 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-run-httpd\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.615918 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.616219 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-log-httpd\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.616247 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " 
pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.616321 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-scripts\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.616390 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-config-data\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.616416 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-run-httpd\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.616493 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdrjf\" (UniqueName: \"kubernetes.io/projected/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-kube-api-access-kdrjf\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.618203 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-log-httpd\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.618475 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-run-httpd\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.622312 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.623693 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-config-data\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.631635 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.635758 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrjf\" (UniqueName: \"kubernetes.io/projected/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-kube-api-access-kdrjf\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc 
kubenswrapper[4690]: I0320 17:53:46.637142 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-scripts\") pod \"ceilometer-0\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.747169 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.867248 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.929560 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-combined-ca-bundle\") pod \"a5823518-338b-4aeb-8425-7e7bd2463a3a\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " Mar 20 17:53:46 crc kubenswrapper[4690]: I0320 17:53:46.961439 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5823518-338b-4aeb-8425-7e7bd2463a3a" (UID: "a5823518-338b-4aeb-8425-7e7bd2463a3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.030803 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data\") pod \"a5823518-338b-4aeb-8425-7e7bd2463a3a\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.030849 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data-custom\") pod \"a5823518-338b-4aeb-8425-7e7bd2463a3a\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.031202 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6xgp\" (UniqueName: \"kubernetes.io/projected/a5823518-338b-4aeb-8425-7e7bd2463a3a-kube-api-access-g6xgp\") pod \"a5823518-338b-4aeb-8425-7e7bd2463a3a\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.031270 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-scripts\") pod \"a5823518-338b-4aeb-8425-7e7bd2463a3a\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.031303 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5823518-338b-4aeb-8425-7e7bd2463a3a-etc-machine-id\") pod \"a5823518-338b-4aeb-8425-7e7bd2463a3a\" (UID: \"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.031337 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5823518-338b-4aeb-8425-7e7bd2463a3a-logs\") pod \"a5823518-338b-4aeb-8425-7e7bd2463a3a\" (UID: 
\"a5823518-338b-4aeb-8425-7e7bd2463a3a\") " Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.031725 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.032000 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5823518-338b-4aeb-8425-7e7bd2463a3a-logs" (OuterVolumeSpecName: "logs") pod "a5823518-338b-4aeb-8425-7e7bd2463a3a" (UID: "a5823518-338b-4aeb-8425-7e7bd2463a3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.032807 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5823518-338b-4aeb-8425-7e7bd2463a3a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a5823518-338b-4aeb-8425-7e7bd2463a3a" (UID: "a5823518-338b-4aeb-8425-7e7bd2463a3a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.035544 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a5823518-338b-4aeb-8425-7e7bd2463a3a" (UID: "a5823518-338b-4aeb-8425-7e7bd2463a3a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.035593 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-scripts" (OuterVolumeSpecName: "scripts") pod "a5823518-338b-4aeb-8425-7e7bd2463a3a" (UID: "a5823518-338b-4aeb-8425-7e7bd2463a3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.035810 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5823518-338b-4aeb-8425-7e7bd2463a3a-kube-api-access-g6xgp" (OuterVolumeSpecName: "kube-api-access-g6xgp") pod "a5823518-338b-4aeb-8425-7e7bd2463a3a" (UID: "a5823518-338b-4aeb-8425-7e7bd2463a3a"). InnerVolumeSpecName "kube-api-access-g6xgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.096100 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data" (OuterVolumeSpecName: "config-data") pod "a5823518-338b-4aeb-8425-7e7bd2463a3a" (UID: "a5823518-338b-4aeb-8425-7e7bd2463a3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.133869 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6xgp\" (UniqueName: \"kubernetes.io/projected/a5823518-338b-4aeb-8425-7e7bd2463a3a-kube-api-access-g6xgp\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.133904 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.133914 4690 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5823518-338b-4aeb-8425-7e7bd2463a3a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.133923 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5823518-338b-4aeb-8425-7e7bd2463a3a-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.133932 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.133946 4690 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5823518-338b-4aeb-8425-7e7bd2463a3a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.197723 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:53:47 crc kubenswrapper[4690]: W0320 17:53:47.200430 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48156974_0ab6_4f24_8d90_c5dcdfbe9f37.slice/crio-85005629a656067448cb1c231d7ab8725957d4f5240bafa4723337aa4b8c4bfb WatchSource:0}: Error finding container 85005629a656067448cb1c231d7ab8725957d4f5240bafa4723337aa4b8c4bfb: Status 404 returned error can't find the container with id 85005629a656067448cb1c231d7ab8725957d4f5240bafa4723337aa4b8c4bfb Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.304784 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-944584c5d-v2mwf" event={"ID":"7d7354f4-3635-4c6c-a382-f405c559ef59","Type":"ContainerStarted","Data":"5e577bf10c962c6f7a37b06305e7a2b008638ea44adc9bb3f7a2a0ed00b7a6d0"} Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.304842 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-944584c5d-v2mwf" event={"ID":"7d7354f4-3635-4c6c-a382-f405c559ef59","Type":"ContainerStarted","Data":"66b7b244fd5460b6a12cf235bd5cdbce1e06adb273c803b3f7dce74b451f0a87"} Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.306120 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.306158 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.308434 4690 generic.go:334] "Generic (PLEG): container finished" podID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerID="81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7" 
exitCode=0 Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.308469 4690 generic.go:334] "Generic (PLEG): container finished" podID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerID="bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c" exitCode=143 Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.308522 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5823518-338b-4aeb-8425-7e7bd2463a3a","Type":"ContainerDied","Data":"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7"} Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.308549 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5823518-338b-4aeb-8425-7e7bd2463a3a","Type":"ContainerDied","Data":"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c"} Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.308566 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5823518-338b-4aeb-8425-7e7bd2463a3a","Type":"ContainerDied","Data":"a88ae6bcd2a21283e57b78dc5be2da37aeae8cad44e34f1a474bbed150629d53"} Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.308589 4690 scope.go:117] "RemoveContainer" containerID="81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.308686 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.313669 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerStarted","Data":"85005629a656067448cb1c231d7ab8725957d4f5240bafa4723337aa4b8c4bfb"} Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.338028 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-944584c5d-v2mwf" podStartSLOduration=2.338006604 podStartE2EDuration="2.338006604s" podCreationTimestamp="2026-03-20 17:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:47.335644439 +0000 UTC m=+1302.201470197" watchObservedRunningTime="2026-03-20 17:53:47.338006604 +0000 UTC m=+1302.203832282" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.349285 4690 scope.go:117] "RemoveContainer" containerID="bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.371327 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.373380 4690 scope.go:117] "RemoveContainer" containerID="81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7" Mar 20 17:53:47 crc kubenswrapper[4690]: E0320 17:53:47.373729 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7\": container with ID starting with 81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7 not found: ID does not exist" containerID="81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.373754 4690 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7"} err="failed to get container status \"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7\": rpc error: code = NotFound desc = could not find container \"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7\": container with ID starting with 81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7 not found: ID does not exist" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.373772 4690 scope.go:117] "RemoveContainer" containerID="bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c" Mar 20 17:53:47 crc kubenswrapper[4690]: E0320 17:53:47.374051 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c\": container with ID starting with bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c not found: ID does not exist" containerID="bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.374070 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c"} err="failed to get container status \"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c\": rpc error: code = NotFound desc = could not find container \"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c\": container with ID starting with bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c not found: ID does not exist" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.374103 4690 scope.go:117] "RemoveContainer" containerID="81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.374292 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7"} err="failed to get container status \"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7\": rpc error: code = NotFound desc = could not find container \"81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7\": container with ID starting with 81faa1084e35849b978953a8eb2fa4aa17c391d15892885e2c4fe1892838c2c7 not found: ID does not exist" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.374307 4690 scope.go:117] "RemoveContainer" containerID="bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.374475 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c"} err="failed to get container status \"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c\": rpc error: code = NotFound desc = could not find container \"bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c\": container with ID starting with bc3602515a5a5f6a60354a25dd2f053eda4540f69c4dc2b8086c4cadc6523a2c not found: ID does not exist" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.380935 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.402700 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:53:47 
crc kubenswrapper[4690]: E0320 17:53:47.403356 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerName="cinder-api" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.403379 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerName="cinder-api" Mar 20 17:53:47 crc kubenswrapper[4690]: E0320 17:53:47.403422 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerName="cinder-api-log" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.403428 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerName="cinder-api-log" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.403602 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerName="cinder-api-log" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.403613 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" containerName="cinder-api" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.404540 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.406500 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.407623 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.411896 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.423290 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458090 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-config-data\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458187 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-scripts\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458338 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drkw6\" (UniqueName: \"kubernetes.io/projected/370661b8-157c-4a7f-ae3e-379d122d48b3-kube-api-access-drkw6\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458372 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/370661b8-157c-4a7f-ae3e-379d122d48b3-logs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458509 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458563 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458717 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/370661b8-157c-4a7f-ae3e-379d122d48b3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458778 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.458848 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.559875 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drkw6\" (UniqueName: \"kubernetes.io/projected/370661b8-157c-4a7f-ae3e-379d122d48b3-kube-api-access-drkw6\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.559929 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/370661b8-157c-4a7f-ae3e-379d122d48b3-logs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.559964 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.559983 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.560016 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/370661b8-157c-4a7f-ae3e-379d122d48b3-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.560037 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.560070 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.560116 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-config-data\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.560154 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-scripts\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.560230 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/370661b8-157c-4a7f-ae3e-379d122d48b3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.560680 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/370661b8-157c-4a7f-ae3e-379d122d48b3-logs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.573069 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.573083 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.573104 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-scripts\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.573582 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" 
Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.576142 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.576674 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drkw6\" (UniqueName: \"kubernetes.io/projected/370661b8-157c-4a7f-ae3e-379d122d48b3-kube-api-access-drkw6\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.576695 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370661b8-157c-4a7f-ae3e-379d122d48b3-config-data\") pod \"cinder-api-0\" (UID: \"370661b8-157c-4a7f-ae3e-379d122d48b3\") " pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.728417 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.895891 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5823518-338b-4aeb-8425-7e7bd2463a3a" path="/var/lib/kubelet/pods/a5823518-338b-4aeb-8425-7e7bd2463a3a/volumes" Mar 20 17:53:47 crc kubenswrapper[4690]: I0320 17:53:47.897214 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3959350-36d3-4ea7-92af-94ac690b406e" path="/var/lib/kubelet/pods/d3959350-36d3-4ea7-92af-94ac690b406e/volumes" Mar 20 17:53:48 crc kubenswrapper[4690]: I0320 17:53:48.200425 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:53:48 crc kubenswrapper[4690]: W0320 17:53:48.205278 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod370661b8_157c_4a7f_ae3e_379d122d48b3.slice/crio-114c288731219d21ce74190aedb47abee93734a35f149365b44521e57c770ebb WatchSource:0}: Error finding container 114c288731219d21ce74190aedb47abee93734a35f149365b44521e57c770ebb: Status 404 returned error can't find the container with id 114c288731219d21ce74190aedb47abee93734a35f149365b44521e57c770ebb Mar 20 17:53:48 crc kubenswrapper[4690]: I0320 17:53:48.338950 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"370661b8-157c-4a7f-ae3e-379d122d48b3","Type":"ContainerStarted","Data":"114c288731219d21ce74190aedb47abee93734a35f149365b44521e57c770ebb"} Mar 20 17:53:48 crc kubenswrapper[4690]: I0320 17:53:48.341053 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerStarted","Data":"e1f7e0388a8a1063e80631cc24d1602951be9d3c2eaa052af5e1aee8d9a983e3"} Mar 20 17:53:48 crc kubenswrapper[4690]: I0320 17:53:48.914661 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.161821 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84b8586c59-k9gqt"] Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.162519 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84b8586c59-k9gqt" 
podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-api" containerID="cri-o://6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c" gracePeriod=30 Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.162636 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84b8586c59-k9gqt" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-httpd" containerID="cri-o://add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da" gracePeriod=30 Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.248846 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-675c5fd7b7-z9vsh"] Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.250286 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.276709 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-675c5fd7b7-z9vsh"] Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.402461 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerStarted","Data":"b9302a3f2fd2401586d51ea789f00ae1f6527ddafdc9f8afb41111770b82dbe5"} Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.417595 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-config\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.417637 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd6td\" (UniqueName: \"kubernetes.io/projected/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-kube-api-access-bd6td\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.417843 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-public-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.417993 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-internal-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.418133 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-httpd-config\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.418175 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-ovndb-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.418285 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-combined-ca-bundle\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.419343 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"370661b8-157c-4a7f-ae3e-379d122d48b3","Type":"ContainerStarted","Data":"c4d127039b6b0831eb7e16059221005b8ca4d397eab11761cd0b622726c43086"} Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.458878 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.519577 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-combined-ca-bundle\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.519655 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-config\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.519676 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd6td\" (UniqueName: \"kubernetes.io/projected/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-kube-api-access-bd6td\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.519722 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-public-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.519768 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-internal-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.519810 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-httpd-config\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.519830 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-ovndb-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.524631 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-combined-ca-bundle\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.525299 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-config\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.525763 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-ovndb-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.529400 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-public-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.530762 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-internal-tls-certs\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.535074 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-httpd-config\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.538333 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd6td\" (UniqueName: \"kubernetes.io/projected/1ce9f480-c11d-4009-98e7-8e1d4a13ecd8-kube-api-access-bd6td\") pod \"neutron-675c5fd7b7-z9vsh\" (UID: \"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8\") " pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:49 crc kubenswrapper[4690]: I0320 17:53:49.605490 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.173598 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-675c5fd7b7-z9vsh"] Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.483565 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerStarted","Data":"408b2ebf9a8651945b2fcdd0c6cf8210f0de2d8332dd37200ca5a8d151c68af9"} Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.487700 4690 generic.go:334] "Generic (PLEG): container finished" podID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerID="add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da" exitCode=0 Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.487761 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b8586c59-k9gqt" event={"ID":"e1874a81-9fc0-4cb0-a681-8ab78df069a0","Type":"ContainerDied","Data":"add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da"} Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.489736 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"370661b8-157c-4a7f-ae3e-379d122d48b3","Type":"ContainerStarted","Data":"6f0ab58753f45e9330f83c24864487792328897772e0825486616625b9fdf4f5"} Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.489903 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.491828 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675c5fd7b7-z9vsh" event={"ID":"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8","Type":"ContainerStarted","Data":"73a9579befc1a831acca0a37c4ede5bd383af76c9cd8d082b132b9c09faf50a5"} Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.491861 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675c5fd7b7-z9vsh" event={"ID":"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8","Type":"ContainerStarted","Data":"ea048d05799ec99b6bf6190535da95e5aa2cee0b0af4966684abefe21c615fa1"} Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.514539 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.51452376 podStartE2EDuration="3.51452376s" podCreationTimestamp="2026-03-20 17:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:50.505399758 +0000 UTC m=+1305.371225436" watchObservedRunningTime="2026-03-20 17:53:50.51452376 +0000 UTC m=+1305.380349438" Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.722429 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.818455 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h8fdq"] Mar 20 17:53:50 crc kubenswrapper[4690]: I0320 17:53:50.819023 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" podUID="a934f11e-9b01-4a42-ba23-75fbf6461c04" containerName="dnsmasq-dns" containerID="cri-o://82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898" gracePeriod=10 Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.034614 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/cinder-scheduler-0" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.045904 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-84b8586c59-k9gqt" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": dial tcp 10.217.0.155:9696: connect: connection refused" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.091090 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.184998 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.278487 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.392407 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.463044 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-nb\") pod \"a934f11e-9b01-4a42-ba23-75fbf6461c04\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.463085 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-svc\") pod \"a934f11e-9b01-4a42-ba23-75fbf6461c04\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.463116 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-sb\") pod \"a934f11e-9b01-4a42-ba23-75fbf6461c04\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.463147 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-swift-storage-0\") pod \"a934f11e-9b01-4a42-ba23-75fbf6461c04\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.463190 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-config\") pod \"a934f11e-9b01-4a42-ba23-75fbf6461c04\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.463266 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t2qk\" (UniqueName: \"kubernetes.io/projected/a934f11e-9b01-4a42-ba23-75fbf6461c04-kube-api-access-2t2qk\") pod \"a934f11e-9b01-4a42-ba23-75fbf6461c04\" (UID: \"a934f11e-9b01-4a42-ba23-75fbf6461c04\") " Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.468936 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a934f11e-9b01-4a42-ba23-75fbf6461c04-kube-api-access-2t2qk" (OuterVolumeSpecName: "kube-api-access-2t2qk") pod "a934f11e-9b01-4a42-ba23-75fbf6461c04" 
(UID: "a934f11e-9b01-4a42-ba23-75fbf6461c04"). InnerVolumeSpecName "kube-api-access-2t2qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.550593 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerStarted","Data":"3dad433afeaad3163f1f217c7c266a77d8f05e0dfc6cb01f8f33f4d9a49c1ca2"} Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.551777 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.559015 4690 generic.go:334] "Generic (PLEG): container finished" podID="a934f11e-9b01-4a42-ba23-75fbf6461c04" containerID="82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898" exitCode=0 Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.559126 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.559333 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" event={"ID":"a934f11e-9b01-4a42-ba23-75fbf6461c04","Type":"ContainerDied","Data":"82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898"} Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.559359 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h8fdq" event={"ID":"a934f11e-9b01-4a42-ba23-75fbf6461c04","Type":"ContainerDied","Data":"6f56d9c4d921023b50fe4d774be1a3efa30728fed3dd0f3a5345f4d3c3992ed7"} Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.559377 4690 scope.go:117] "RemoveContainer" containerID="82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.563189 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a934f11e-9b01-4a42-ba23-75fbf6461c04" (UID: "a934f11e-9b01-4a42-ba23-75fbf6461c04"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.569072 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675c5fd7b7-z9vsh" event={"ID":"1ce9f480-c11d-4009-98e7-8e1d4a13ecd8","Type":"ContainerStarted","Data":"e3003e56c48f42d0643d42648a64dc82cadaa4f50bb1f8cb381ac5197645307e"} Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.569109 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.569224 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerName="cinder-scheduler" containerID="cri-o://3a3910af7bc6e45224246856a6c4b3db5027b27a45428a159726fd635325e20f" gracePeriod=30 Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.571576 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerName="probe" containerID="cri-o://78a6a708e8f1e4570a60795cb1ccaacbc4545031dccf0efe5c2caca106220363" gracePeriod=30 Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.584276 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.584437 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t2qk\" (UniqueName: \"kubernetes.io/projected/a934f11e-9b01-4a42-ba23-75fbf6461c04-kube-api-access-2t2qk\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.591042 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.663586113 podStartE2EDuration="5.591021258s" podCreationTimestamp="2026-03-20 17:53:46 +0000 UTC" firstStartedPulling="2026-03-20 17:53:47.203225941 +0000 UTC m=+1302.069051629" lastFinishedPulling="2026-03-20 17:53:51.130661106 +0000 UTC m=+1305.996486774" observedRunningTime="2026-03-20 17:53:51.577813492 +0000 UTC m=+1306.443639170" watchObservedRunningTime="2026-03-20 17:53:51.591021258 +0000 UTC m=+1306.456846936" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.593869 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-config" (OuterVolumeSpecName: "config") pod "a934f11e-9b01-4a42-ba23-75fbf6461c04" (UID: "a934f11e-9b01-4a42-ba23-75fbf6461c04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.607727 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a934f11e-9b01-4a42-ba23-75fbf6461c04" (UID: "a934f11e-9b01-4a42-ba23-75fbf6461c04"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.617273 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a934f11e-9b01-4a42-ba23-75fbf6461c04" (UID: "a934f11e-9b01-4a42-ba23-75fbf6461c04"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.654511 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a934f11e-9b01-4a42-ba23-75fbf6461c04" (UID: "a934f11e-9b01-4a42-ba23-75fbf6461c04"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.660279 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-675c5fd7b7-z9vsh" podStartSLOduration=2.660246125 podStartE2EDuration="2.660246125s" podCreationTimestamp="2026-03-20 17:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:51.637129275 +0000 UTC m=+1306.502954953" watchObservedRunningTime="2026-03-20 17:53:51.660246125 +0000 UTC m=+1306.526071803" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.687040 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.687074 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.687087 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.687098 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a934f11e-9b01-4a42-ba23-75fbf6461c04-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.725703 4690 scope.go:117] "RemoveContainer" containerID="abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.754311 4690 scope.go:117] "RemoveContainer" containerID="82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898" Mar 20 17:53:51 crc kubenswrapper[4690]: E0320 17:53:51.754725 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898\": container with ID starting with 82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898 not found: ID does not exist" containerID="82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.754768 4690 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898"} err="failed to get container status \"82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898\": rpc error: code = NotFound desc = could not find container \"82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898\": container with ID starting with 82ddac1eb80305863e20d8cde15e54b6d46c7fbf09cf9fd0f255effe58bb9898 not found: ID does not exist" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.754792 4690 scope.go:117] "RemoveContainer" containerID="abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a" Mar 20 17:53:51 crc kubenswrapper[4690]: E0320 17:53:51.755031 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a\": container with ID starting with abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a not found: ID does not exist" containerID="abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.755059 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a"} err="failed to get container status \"abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a\": rpc error: code = NotFound desc = could not find container \"abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a\": container with ID starting with abbeb25bb51ec5e5c03664ce8d182749bb93e9b728366afc993485009921368a not found: ID does not exist" Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.902035 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h8fdq"] Mar 20 17:53:51 crc kubenswrapper[4690]: I0320 17:53:51.909592 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h8fdq"] Mar 20 17:53:52 crc kubenswrapper[4690]: I0320 17:53:52.110094 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:53:52 crc kubenswrapper[4690]: I0320 17:53:52.367170 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:53:52 crc kubenswrapper[4690]: I0320 17:53:52.579006 4690 generic.go:334] "Generic (PLEG): container finished" podID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerID="78a6a708e8f1e4570a60795cb1ccaacbc4545031dccf0efe5c2caca106220363" exitCode=0 Mar 20 17:53:52 crc kubenswrapper[4690]: I0320 17:53:52.579860 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c2dbf4c6-e3eb-4984-a39a-0981181cea31","Type":"ContainerDied","Data":"78a6a708e8f1e4570a60795cb1ccaacbc4545031dccf0efe5c2caca106220363"} Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.509887 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.590309 4690 generic.go:334] "Generic (PLEG): container finished" podID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerID="6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c" exitCode=0 Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.591051 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84b8586c59-k9gqt" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.591094 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b8586c59-k9gqt" event={"ID":"e1874a81-9fc0-4cb0-a681-8ab78df069a0","Type":"ContainerDied","Data":"6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c"} Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.591165 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b8586c59-k9gqt" event={"ID":"e1874a81-9fc0-4cb0-a681-8ab78df069a0","Type":"ContainerDied","Data":"f9f880736267c6488b3b56ef34460cc43c3cee9d34f6fa362e061a623b852b36"} Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.591185 4690 scope.go:117] "RemoveContainer" containerID="add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.613498 4690 scope.go:117] "RemoveContainer" containerID="6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.620180 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcwtp\" (UniqueName: \"kubernetes.io/projected/e1874a81-9fc0-4cb0-a681-8ab78df069a0-kube-api-access-xcwtp\") pod \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.620346 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-config\") pod \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.620386 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-combined-ca-bundle\") pod \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.620421 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-public-tls-certs\") pod \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.620489 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-internal-tls-certs\") pod \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.620544 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-ovndb-tls-certs\") pod \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.620584 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-httpd-config\") pod \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\" (UID: \"e1874a81-9fc0-4cb0-a681-8ab78df069a0\") " Mar 20 17:53:53 crc 
kubenswrapper[4690]: I0320 17:53:53.626078 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e1874a81-9fc0-4cb0-a681-8ab78df069a0" (UID: "e1874a81-9fc0-4cb0-a681-8ab78df069a0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.635114 4690 scope.go:117] "RemoveContainer" containerID="add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da" Mar 20 17:53:53 crc kubenswrapper[4690]: E0320 17:53:53.635583 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da\": container with ID starting with add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da not found: ID does not exist" containerID="add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.635631 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da"} err="failed to get container status \"add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da\": rpc error: code = NotFound desc = could not find container \"add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da\": container with ID starting with add83b1f86a4bfa988fa96f33e00996aa647c96022a0d0eed9e863ed744e00da not found: ID does not exist" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.635681 4690 scope.go:117] "RemoveContainer" containerID="6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c" Mar 20 17:53:53 crc kubenswrapper[4690]: E0320 17:53:53.636708 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c\": container with ID starting with 6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c not found: ID does not exist" containerID="6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.636741 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c"} err="failed to get container status \"6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c\": rpc error: code = NotFound desc = could not find container \"6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c\": container with ID starting with 6ce1b558ef5aec84880806931e4b38462b049a8175ebbdb3b31e68f707f0a15c not found: ID does not exist" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.637351 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1874a81-9fc0-4cb0-a681-8ab78df069a0-kube-api-access-xcwtp" (OuterVolumeSpecName: "kube-api-access-xcwtp") pod "e1874a81-9fc0-4cb0-a681-8ab78df069a0" (UID: "e1874a81-9fc0-4cb0-a681-8ab78df069a0"). InnerVolumeSpecName "kube-api-access-xcwtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.682093 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-config" (OuterVolumeSpecName: "config") pod "e1874a81-9fc0-4cb0-a681-8ab78df069a0" (UID: "e1874a81-9fc0-4cb0-a681-8ab78df069a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.695339 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e1874a81-9fc0-4cb0-a681-8ab78df069a0" (UID: "e1874a81-9fc0-4cb0-a681-8ab78df069a0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.697706 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1874a81-9fc0-4cb0-a681-8ab78df069a0" (UID: "e1874a81-9fc0-4cb0-a681-8ab78df069a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.710923 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e1874a81-9fc0-4cb0-a681-8ab78df069a0" (UID: "e1874a81-9fc0-4cb0-a681-8ab78df069a0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.723082 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcwtp\" (UniqueName: \"kubernetes.io/projected/e1874a81-9fc0-4cb0-a681-8ab78df069a0-kube-api-access-xcwtp\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.723113 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.723122 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.723134 4690 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.723142 4690 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.723152 4690 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.728332 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e1874a81-9fc0-4cb0-a681-8ab78df069a0" (UID: "e1874a81-9fc0-4cb0-a681-8ab78df069a0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.825103 4690 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1874a81-9fc0-4cb0-a681-8ab78df069a0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.919166 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a934f11e-9b01-4a42-ba23-75fbf6461c04" path="/var/lib/kubelet/pods/a934f11e-9b01-4a42-ba23-75fbf6461c04/volumes" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.939747 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84b8586c59-k9gqt"] Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.944019 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:53:53 crc kubenswrapper[4690]: I0320 17:53:53.949158 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84b8586c59-k9gqt"] Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.193011 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-dc95ccffb-gvrdq" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.264977 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-867c5896-qkwmr"] Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.601215 4690 generic.go:334] "Generic (PLEG): container finished" podID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerID="3a3910af7bc6e45224246856a6c4b3db5027b27a45428a159726fd635325e20f" exitCode=0 Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.601281 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c2dbf4c6-e3eb-4984-a39a-0981181cea31","Type":"ContainerDied","Data":"3a3910af7bc6e45224246856a6c4b3db5027b27a45428a159726fd635325e20f"} Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.601647 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c2dbf4c6-e3eb-4984-a39a-0981181cea31","Type":"ContainerDied","Data":"cfe61d2e5628a1be9ae529eab87694351bc292528f361a6a1c264f31d287b55e"} Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.601663 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfe61d2e5628a1be9ae529eab87694351bc292528f361a6a1c264f31d287b55e" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.601785 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-867c5896-qkwmr" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon-log" containerID="cri-o://83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d" gracePeriod=30 Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.601885 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-867c5896-qkwmr" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon" containerID="cri-o://986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a" gracePeriod=30 Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.629545 4690 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.643722 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-scripts\") pod \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.643862 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data\") pod \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.643921 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2dbf4c6-e3eb-4984-a39a-0981181cea31-etc-machine-id\") pod \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.643963 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-combined-ca-bundle\") pod \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.644032 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data-custom\") pod \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.644058 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2dbf4c6-e3eb-4984-a39a-0981181cea31-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c2dbf4c6-e3eb-4984-a39a-0981181cea31" (UID: "c2dbf4c6-e3eb-4984-a39a-0981181cea31"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.644088 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pczz\" (UniqueName: \"kubernetes.io/projected/c2dbf4c6-e3eb-4984-a39a-0981181cea31-kube-api-access-2pczz\") pod \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\" (UID: \"c2dbf4c6-e3eb-4984-a39a-0981181cea31\") " Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.645043 4690 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2dbf4c6-e3eb-4984-a39a-0981181cea31-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.659985 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2dbf4c6-e3eb-4984-a39a-0981181cea31-kube-api-access-2pczz" (OuterVolumeSpecName: "kube-api-access-2pczz") pod "c2dbf4c6-e3eb-4984-a39a-0981181cea31" (UID: "c2dbf4c6-e3eb-4984-a39a-0981181cea31"). InnerVolumeSpecName "kube-api-access-2pczz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.664893 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-scripts" (OuterVolumeSpecName: "scripts") pod "c2dbf4c6-e3eb-4984-a39a-0981181cea31" (UID: "c2dbf4c6-e3eb-4984-a39a-0981181cea31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.671474 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c2dbf4c6-e3eb-4984-a39a-0981181cea31" (UID: "c2dbf4c6-e3eb-4984-a39a-0981181cea31"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.725029 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2dbf4c6-e3eb-4984-a39a-0981181cea31" (UID: "c2dbf4c6-e3eb-4984-a39a-0981181cea31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.746379 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.746600 4690 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.746669 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pczz\" (UniqueName: \"kubernetes.io/projected/c2dbf4c6-e3eb-4984-a39a-0981181cea31-kube-api-access-2pczz\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.746781 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.812645 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data" (OuterVolumeSpecName: "config-data") pod "c2dbf4c6-e3eb-4984-a39a-0981181cea31" (UID: "c2dbf4c6-e3eb-4984-a39a-0981181cea31"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:53:54 crc kubenswrapper[4690]: I0320 17:53:54.848730 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2dbf4c6-e3eb-4984-a39a-0981181cea31-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.258716 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.524028 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.617957 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.658803 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.675745 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.692799 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:55 crc kubenswrapper[4690]: E0320 17:53:55.693227 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-httpd" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693244 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-httpd" Mar 20 17:53:55 crc kubenswrapper[4690]: E0320 17:53:55.693278 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a934f11e-9b01-4a42-ba23-75fbf6461c04" containerName="dnsmasq-dns" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693289 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a934f11e-9b01-4a42-ba23-75fbf6461c04" containerName="dnsmasq-dns" Mar 20 17:53:55 crc kubenswrapper[4690]: E0320 17:53:55.693310 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerName="probe" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693320 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerName="probe" Mar 20 17:53:55 crc kubenswrapper[4690]: E0320 17:53:55.693338 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-api" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693345 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-api" Mar 20 17:53:55 crc kubenswrapper[4690]: E0320 17:53:55.693365 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerName="cinder-scheduler" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693372 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerName="cinder-scheduler" Mar 20 17:53:55 crc kubenswrapper[4690]: E0320 17:53:55.693386 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a934f11e-9b01-4a42-ba23-75fbf6461c04" containerName="init" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693397 4690 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a934f11e-9b01-4a42-ba23-75fbf6461c04" containerName="init" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693605 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-httpd" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693622 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerName="cinder-scheduler" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693636 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" containerName="neutron-api" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693644 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a934f11e-9b01-4a42-ba23-75fbf6461c04" containerName="dnsmasq-dns" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.693658 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" containerName="probe" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.694849 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.696662 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.713864 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.824349 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-77ff877fdd-nntbj"] Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.825726 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.836690 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77ff877fdd-nntbj"] Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.866733 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-scripts\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.867504 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-config-data\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.867607 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52mnh\" (UniqueName: \"kubernetes.io/projected/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-kube-api-access-52mnh\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.867764 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.867947 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.868098 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.893490 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2dbf4c6-e3eb-4984-a39a-0981181cea31" path="/var/lib/kubelet/pods/c2dbf4c6-e3eb-4984-a39a-0981181cea31/volumes" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.894057 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1874a81-9fc0-4cb0-a681-8ab78df069a0" path="/var/lib/kubelet/pods/e1874a81-9fc0-4cb0-a681-8ab78df069a0/volumes" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969350 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-config-data\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969409 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02713b3f-f042-40fc-a24e-f68ac876ae20-logs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969447 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52mnh\" (UniqueName: \"kubernetes.io/projected/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-kube-api-access-52mnh\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969464 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-internal-tls-certs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969494 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969617 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969692 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969822 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-scripts\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969907 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-config-data\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969969 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-public-tls-certs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.969998 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.970105 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-combined-ca-bundle\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.970192 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-scripts\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.970223 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl4rq\" (UniqueName: \"kubernetes.io/projected/02713b3f-f042-40fc-a24e-f68ac876ae20-kube-api-access-pl4rq\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.974896 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.974949 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.978629 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-scripts\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.979060 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-config-data\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:55 crc kubenswrapper[4690]: I0320 17:53:55.989962 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52mnh\" (UniqueName: \"kubernetes.io/projected/582dd6c0-32f0-41f1-b62d-2dfc7f5b6509-kube-api-access-52mnh\") pod \"cinder-scheduler-0\" (UID: \"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509\") " pod="openstack/cinder-scheduler-0" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.011951 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.075335 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-internal-tls-certs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.075760 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-scripts\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.075782 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-config-data\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.075809 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-public-tls-certs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.075836 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-combined-ca-bundle\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.075863 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl4rq\" (UniqueName: \"kubernetes.io/projected/02713b3f-f042-40fc-a24e-f68ac876ae20-kube-api-access-pl4rq\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.075900 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02713b3f-f042-40fc-a24e-f68ac876ae20-logs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.080422 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02713b3f-f042-40fc-a24e-f68ac876ae20-logs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.081558 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-internal-tls-certs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.084242 4690 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-combined-ca-bundle\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.088654 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-scripts\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.094999 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-config-data\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.095761 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl4rq\" (UniqueName: \"kubernetes.io/projected/02713b3f-f042-40fc-a24e-f68ac876ae20-kube-api-access-pl4rq\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.096701 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02713b3f-f042-40fc-a24e-f68ac876ae20-public-tls-certs\") pod \"placement-77ff877fdd-nntbj\" (UID: \"02713b3f-f042-40fc-a24e-f68ac876ae20\") " pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.138925 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.583473 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:53:56 crc kubenswrapper[4690]: W0320 17:53:56.590064 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod582dd6c0_32f0_41f1_b62d_2dfc7f5b6509.slice/crio-348a7a6a48f8373e98b21b0d03d4e6897699777e1f2c72dd6467756592239848 WatchSource:0}: Error finding container 348a7a6a48f8373e98b21b0d03d4e6897699777e1f2c72dd6467756592239848: Status 404 returned error can't find the container with id 348a7a6a48f8373e98b21b0d03d4e6897699777e1f2c72dd6467756592239848 Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.632190 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509","Type":"ContainerStarted","Data":"348a7a6a48f8373e98b21b0d03d4e6897699777e1f2c72dd6467756592239848"} Mar 20 17:53:56 crc kubenswrapper[4690]: W0320 17:53:56.674432 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02713b3f_f042_40fc_a24e_f68ac876ae20.slice/crio-51591f59483f26550c2568ce3129842d2e4ba111de0838b69d9a631b5cca4325 WatchSource:0}: Error finding container 51591f59483f26550c2568ce3129842d2e4ba111de0838b69d9a631b5cca4325: Status 404 returned error can't find the container with id 51591f59483f26550c2568ce3129842d2e4ba111de0838b69d9a631b5cca4325 Mar 20 17:53:56 crc kubenswrapper[4690]: I0320 17:53:56.685068 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77ff877fdd-nntbj"] Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.353187 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.465420 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-944584c5d-v2mwf" Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.571712 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-78ff979d76-5nxvv"] Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.571928 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-78ff979d76-5nxvv" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api-log" containerID="cri-o://093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c" gracePeriod=30 Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.574059 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-78ff979d76-5nxvv" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api" containerID="cri-o://03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8" gracePeriod=30 Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.672676 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77ff877fdd-nntbj" event={"ID":"02713b3f-f042-40fc-a24e-f68ac876ae20","Type":"ContainerStarted","Data":"106641c006dab838956fc7a9c2c7ffa163a82736c7a67676cadf9b2a0e2cf033"} Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.673125 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77ff877fdd-nntbj" 
event={"ID":"02713b3f-f042-40fc-a24e-f68ac876ae20","Type":"ContainerStarted","Data":"aee3f9bec8025de8f456c61d1193bc21badbfefad09cc9c07babe28bdf903d96"} Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.673136 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77ff877fdd-nntbj" event={"ID":"02713b3f-f042-40fc-a24e-f68ac876ae20","Type":"ContainerStarted","Data":"51591f59483f26550c2568ce3129842d2e4ba111de0838b69d9a631b5cca4325"} Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.674784 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.674823 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.689154 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509","Type":"ContainerStarted","Data":"261b3b6f44cf2081af434ced2d5eca942a8858618468b7f3d4318ac5060173f1"} Mar 20 17:53:57 crc kubenswrapper[4690]: I0320 17:53:57.752968 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-77ff877fdd-nntbj" podStartSLOduration=2.752944957 podStartE2EDuration="2.752944957s" podCreationTimestamp="2026-03-20 17:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:53:57.714849332 +0000 UTC m=+1312.580675030" watchObservedRunningTime="2026-03-20 17:53:57.752944957 +0000 UTC m=+1312.618770635" Mar 20 17:53:58 crc kubenswrapper[4690]: I0320 17:53:58.702337 4690 generic.go:334] "Generic (PLEG): container finished" podID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerID="986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a" exitCode=0 Mar 20 17:53:58 crc kubenswrapper[4690]: I0320 17:53:58.702409 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867c5896-qkwmr" event={"ID":"607d61e7-e52a-46e6-a23a-2d4714c5b543","Type":"ContainerDied","Data":"986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a"} Mar 20 17:53:58 crc kubenswrapper[4690]: I0320 17:53:58.705226 4690 generic.go:334] "Generic (PLEG): container finished" podID="61654056-0ac2-4383-9018-4ddd302b2620" containerID="093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c" exitCode=143 Mar 20 17:53:58 crc kubenswrapper[4690]: I0320 17:53:58.705308 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78ff979d76-5nxvv" event={"ID":"61654056-0ac2-4383-9018-4ddd302b2620","Type":"ContainerDied","Data":"093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c"} Mar 20 17:53:58 crc kubenswrapper[4690]: I0320 17:53:58.709005 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"582dd6c0-32f0-41f1-b62d-2dfc7f5b6509","Type":"ContainerStarted","Data":"7440bd0e66b0687d398812cfbb2380b62eb2de768f917c98cb53a08686528a6c"} Mar 20 17:53:58 crc kubenswrapper[4690]: I0320 17:53:58.779947 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.7799298329999997 podStartE2EDuration="3.779929833s" podCreationTimestamp="2026-03-20 17:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:53:58.777807884 +0000 UTC m=+1313.643633572" watchObservedRunningTime="2026-03-20 17:53:58.779929833 +0000 UTC m=+1313.645755511" Mar 20 17:53:59 crc kubenswrapper[4690]: I0320 17:53:59.688475 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-867c5896-qkwmr" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.102231 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.185540 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567154-x5hrx"] Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.187355 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-x5hrx" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.191245 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.191645 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.191877 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.195091 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-x5hrx"] Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.355315 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gfn\" (UniqueName: \"kubernetes.io/projected/b35a2b62-f869-4a99-a922-68822abfaa30-kube-api-access-78gfn\") pod \"auto-csr-approver-29567154-x5hrx\" (UID: \"b35a2b62-f869-4a99-a922-68822abfaa30\") " pod="openshift-infra/auto-csr-approver-29567154-x5hrx" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.457530 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78gfn\" (UniqueName: \"kubernetes.io/projected/b35a2b62-f869-4a99-a922-68822abfaa30-kube-api-access-78gfn\") pod \"auto-csr-approver-29567154-x5hrx\" (UID: \"b35a2b62-f869-4a99-a922-68822abfaa30\") " pod="openshift-infra/auto-csr-approver-29567154-x5hrx" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.483081 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gfn\" (UniqueName: \"kubernetes.io/projected/b35a2b62-f869-4a99-a922-68822abfaa30-kube-api-access-78gfn\") pod \"auto-csr-approver-29567154-x5hrx\" (UID: \"b35a2b62-f869-4a99-a922-68822abfaa30\") " pod="openshift-infra/auto-csr-approver-29567154-x5hrx" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.513056 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-x5hrx" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.809601 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-78ff979d76-5nxvv" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:43454->10.217.0.161:9311: read: connection reset by peer" Mar 20 17:54:00 crc kubenswrapper[4690]: I0320 17:54:00.809620 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-78ff979d76-5nxvv" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:43442->10.217.0.161:9311: read: connection reset by peer" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.000045 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-x5hrx"] Mar 20 17:54:01 crc kubenswrapper[4690]: W0320 17:54:01.000372 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb35a2b62_f869_4a99_a922_68822abfaa30.slice/crio-ab8950d580798ce55cd6d3a2af0183c361ef42383aeae4b695f350338c90431e WatchSource:0}: Error finding container ab8950d580798ce55cd6d3a2af0183c361ef42383aeae4b695f350338c90431e: Status 404 returned error can't find the container with id ab8950d580798ce55cd6d3a2af0183c361ef42383aeae4b695f350338c90431e Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.012626 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.264114 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.376284 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7k9p\" (UniqueName: \"kubernetes.io/projected/61654056-0ac2-4383-9018-4ddd302b2620-kube-api-access-f7k9p\") pod \"61654056-0ac2-4383-9018-4ddd302b2620\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.376677 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data-custom\") pod \"61654056-0ac2-4383-9018-4ddd302b2620\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.376778 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-combined-ca-bundle\") pod \"61654056-0ac2-4383-9018-4ddd302b2620\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.376812 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61654056-0ac2-4383-9018-4ddd302b2620-logs\") pod \"61654056-0ac2-4383-9018-4ddd302b2620\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.376854 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data\") pod \"61654056-0ac2-4383-9018-4ddd302b2620\" (UID: \"61654056-0ac2-4383-9018-4ddd302b2620\") " Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.379507 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61654056-0ac2-4383-9018-4ddd302b2620-logs" (OuterVolumeSpecName: "logs") pod "61654056-0ac2-4383-9018-4ddd302b2620" (UID: "61654056-0ac2-4383-9018-4ddd302b2620"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.383515 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61654056-0ac2-4383-9018-4ddd302b2620-kube-api-access-f7k9p" (OuterVolumeSpecName: "kube-api-access-f7k9p") pod "61654056-0ac2-4383-9018-4ddd302b2620" (UID: "61654056-0ac2-4383-9018-4ddd302b2620"). InnerVolumeSpecName "kube-api-access-f7k9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.384264 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "61654056-0ac2-4383-9018-4ddd302b2620" (UID: "61654056-0ac2-4383-9018-4ddd302b2620"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.408582 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61654056-0ac2-4383-9018-4ddd302b2620" (UID: "61654056-0ac2-4383-9018-4ddd302b2620"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.429603 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data" (OuterVolumeSpecName: "config-data") pod "61654056-0ac2-4383-9018-4ddd302b2620" (UID: "61654056-0ac2-4383-9018-4ddd302b2620"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.478460 4690 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.478504 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.478517 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61654056-0ac2-4383-9018-4ddd302b2620-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.478530 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61654056-0ac2-4383-9018-4ddd302b2620-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.478541 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7k9p\" (UniqueName: \"kubernetes.io/projected/61654056-0ac2-4383-9018-4ddd302b2620-kube-api-access-f7k9p\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.747785 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-x5hrx" event={"ID":"b35a2b62-f869-4a99-a922-68822abfaa30","Type":"ContainerStarted","Data":"ab8950d580798ce55cd6d3a2af0183c361ef42383aeae4b695f350338c90431e"} Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.749959 4690 generic.go:334] "Generic (PLEG): container finished" podID="61654056-0ac2-4383-9018-4ddd302b2620" containerID="03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8" exitCode=0 Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.750036 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-78ff979d76-5nxvv" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.750075 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78ff979d76-5nxvv" event={"ID":"61654056-0ac2-4383-9018-4ddd302b2620","Type":"ContainerDied","Data":"03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8"} Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.750295 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78ff979d76-5nxvv" event={"ID":"61654056-0ac2-4383-9018-4ddd302b2620","Type":"ContainerDied","Data":"cd16dd73f52d56a362f16d90398a87d5de93a54ec31c9949ffe4fe4f75ff39c1"} Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.750322 4690 scope.go:117] "RemoveContainer" containerID="03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.786019 4690 scope.go:117] "RemoveContainer" containerID="093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.794236 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-78ff979d76-5nxvv"] Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.802910 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-78ff979d76-5nxvv"] Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.808148 4690 scope.go:117] "RemoveContainer" containerID="03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8" Mar 20 17:54:01 crc kubenswrapper[4690]: E0320 17:54:01.808583 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8\": container with ID starting with 03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8 not found: ID does not exist" containerID="03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.808623 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8"} err="failed to get container status \"03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8\": rpc error: code = NotFound desc = could not find container \"03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8\": container with ID starting with 03c36b28d28a3c83edeff3a8e5edb85be531d0ee6476f60042a6c8fb3ca890c8 not found: ID does not exist" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.808650 4690 scope.go:117] "RemoveContainer" containerID="093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c" Mar 20 17:54:01 crc kubenswrapper[4690]: E0320 17:54:01.808899 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c\": container with ID starting with 093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c not found: ID does not exist" containerID="093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.808918 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c"} err="failed to get container status 
\"093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c\": rpc error: code = NotFound desc = could not find container \"093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c\": container with ID starting with 093db73a54b1c649a617b97632a944df93d4462e6a78ec7a84674a066b2eb40c not found: ID does not exist" Mar 20 17:54:01 crc kubenswrapper[4690]: I0320 17:54:01.894830 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61654056-0ac2-4383-9018-4ddd302b2620" path="/var/lib/kubelet/pods/61654056-0ac2-4383-9018-4ddd302b2620/volumes" Mar 20 17:54:02 crc kubenswrapper[4690]: I0320 17:54:02.092791 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b966595d7-ccrp2" Mar 20 17:54:02 crc kubenswrapper[4690]: I0320 17:54:02.760572 4690 generic.go:334] "Generic (PLEG): container finished" podID="b35a2b62-f869-4a99-a922-68822abfaa30" containerID="eba29818292a7ee79edced794f1a5f5bd1986090cfbc4d98d81db31754af90eb" exitCode=0 Mar 20 17:54:02 crc kubenswrapper[4690]: I0320 17:54:02.760621 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-x5hrx" event={"ID":"b35a2b62-f869-4a99-a922-68822abfaa30","Type":"ContainerDied","Data":"eba29818292a7ee79edced794f1a5f5bd1986090cfbc4d98d81db31754af90eb"} Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.453391 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 17:54:03 crc kubenswrapper[4690]: E0320 17:54:03.454047 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api-log" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.454075 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api-log" Mar 20 17:54:03 crc kubenswrapper[4690]: E0320 17:54:03.454100 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.454114 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.454474 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.454523 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="61654056-0ac2-4383-9018-4ddd302b2620" containerName="barbican-api-log" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.455539 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.458865 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.459034 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.462997 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9r77s" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.488007 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.618954 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a537f291-8787-434c-84bc-4355ccccbe47-openstack-config-secret\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.619027 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a537f291-8787-434c-84bc-4355ccccbe47-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.619206 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a537f291-8787-434c-84bc-4355ccccbe47-openstack-config\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.619336 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4xtp\" (UniqueName: \"kubernetes.io/projected/a537f291-8787-434c-84bc-4355ccccbe47-kube-api-access-r4xtp\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.721367 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a537f291-8787-434c-84bc-4355ccccbe47-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.721495 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a537f291-8787-434c-84bc-4355ccccbe47-openstack-config\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.721523 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4xtp\" (UniqueName: \"kubernetes.io/projected/a537f291-8787-434c-84bc-4355ccccbe47-kube-api-access-r4xtp\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.721581 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a537f291-8787-434c-84bc-4355ccccbe47-openstack-config-secret\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.722611 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a537f291-8787-434c-84bc-4355ccccbe47-openstack-config\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.727705 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a537f291-8787-434c-84bc-4355ccccbe47-openstack-config-secret\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.734874 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a537f291-8787-434c-84bc-4355ccccbe47-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.748746 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4xtp\" (UniqueName: \"kubernetes.io/projected/a537f291-8787-434c-84bc-4355ccccbe47-kube-api-access-r4xtp\") pod \"openstackclient\" (UID: \"a537f291-8787-434c-84bc-4355ccccbe47\") " pod="openstack/openstackclient" Mar 20 17:54:03 crc kubenswrapper[4690]: I0320 17:54:03.781636 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.161945 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-x5hrx" Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.324010 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 17:54:04 crc kubenswrapper[4690]: W0320 17:54:04.325388 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda537f291_8787_434c_84bc_4355ccccbe47.slice/crio-9ce9e139cfc90d1160aba7b1020d036c8a75e82048b00a0389399792a6974073 WatchSource:0}: Error finding container 9ce9e139cfc90d1160aba7b1020d036c8a75e82048b00a0389399792a6974073: Status 404 returned error can't find the container with id 9ce9e139cfc90d1160aba7b1020d036c8a75e82048b00a0389399792a6974073 Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.336555 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78gfn\" (UniqueName: \"kubernetes.io/projected/b35a2b62-f869-4a99-a922-68822abfaa30-kube-api-access-78gfn\") pod \"b35a2b62-f869-4a99-a922-68822abfaa30\" (UID: \"b35a2b62-f869-4a99-a922-68822abfaa30\") " Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.344637 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b35a2b62-f869-4a99-a922-68822abfaa30-kube-api-access-78gfn" (OuterVolumeSpecName: "kube-api-access-78gfn") pod "b35a2b62-f869-4a99-a922-68822abfaa30" (UID: "b35a2b62-f869-4a99-a922-68822abfaa30"). InnerVolumeSpecName "kube-api-access-78gfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.439101 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78gfn\" (UniqueName: \"kubernetes.io/projected/b35a2b62-f869-4a99-a922-68822abfaa30-kube-api-access-78gfn\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.781342 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-x5hrx" event={"ID":"b35a2b62-f869-4a99-a922-68822abfaa30","Type":"ContainerDied","Data":"ab8950d580798ce55cd6d3a2af0183c361ef42383aeae4b695f350338c90431e"} Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.781380 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-x5hrx" Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.781406 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab8950d580798ce55cd6d3a2af0183c361ef42383aeae4b695f350338c90431e" Mar 20 17:54:04 crc kubenswrapper[4690]: I0320 17:54:04.782942 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a537f291-8787-434c-84bc-4355ccccbe47","Type":"ContainerStarted","Data":"9ce9e139cfc90d1160aba7b1020d036c8a75e82048b00a0389399792a6974073"} Mar 20 17:54:05 crc kubenswrapper[4690]: I0320 17:54:05.265606 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-8qwm8"] Mar 20 17:54:05 crc kubenswrapper[4690]: I0320 17:54:05.275235 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-8qwm8"] Mar 20 17:54:05 crc kubenswrapper[4690]: I0320 17:54:05.900597 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb50ad2-7215-449a-8280-a13e4e324734" path="/var/lib/kubelet/pods/8bb50ad2-7215-449a-8280-a13e4e324734/volumes" Mar 20 17:54:06 crc kubenswrapper[4690]: I0320 17:54:06.288056 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.087976 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-799f9bd8b7-4q7w9"] Mar 20 17:54:07 crc kubenswrapper[4690]: E0320 17:54:07.088390 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35a2b62-f869-4a99-a922-68822abfaa30" containerName="oc" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.088401 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35a2b62-f869-4a99-a922-68822abfaa30" containerName="oc" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.088576 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="b35a2b62-f869-4a99-a922-68822abfaa30" containerName="oc" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.094702 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.102705 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.104955 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.109664 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.125330 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-799f9bd8b7-4q7w9"] Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.204431 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcfn7\" (UniqueName: \"kubernetes.io/projected/3f074183-2793-4719-95b3-c2df447c93ab-kube-api-access-mcfn7\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.204507 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-config-data\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.204742 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f074183-2793-4719-95b3-c2df447c93ab-log-httpd\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.204825 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-combined-ca-bundle\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.204875 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f074183-2793-4719-95b3-c2df447c93ab-run-httpd\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.204989 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3f074183-2793-4719-95b3-c2df447c93ab-etc-swift\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.205097 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-internal-tls-certs\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " 
pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.205237 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-public-tls-certs\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.306910 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-internal-tls-certs\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.307019 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-public-tls-certs\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.307047 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcfn7\" (UniqueName: \"kubernetes.io/projected/3f074183-2793-4719-95b3-c2df447c93ab-kube-api-access-mcfn7\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.307077 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-config-data\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.307143 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f074183-2793-4719-95b3-c2df447c93ab-log-httpd\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.307176 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-combined-ca-bundle\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.307203 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f074183-2793-4719-95b3-c2df447c93ab-run-httpd\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.307294 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3f074183-2793-4719-95b3-c2df447c93ab-etc-swift\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" 
Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.308541 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f074183-2793-4719-95b3-c2df447c93ab-run-httpd\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.308691 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f074183-2793-4719-95b3-c2df447c93ab-log-httpd\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.324318 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-public-tls-certs\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.324486 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-config-data\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.324534 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-combined-ca-bundle\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.324911 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f074183-2793-4719-95b3-c2df447c93ab-internal-tls-certs\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.325726 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3f074183-2793-4719-95b3-c2df447c93ab-etc-swift\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.326390 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcfn7\" (UniqueName: \"kubernetes.io/projected/3f074183-2793-4719-95b3-c2df447c93ab-kube-api-access-mcfn7\") pod \"swift-proxy-799f9bd8b7-4q7w9\" (UID: \"3f074183-2793-4719-95b3-c2df447c93ab\") " pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:07 crc kubenswrapper[4690]: I0320 17:54:07.468435 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:08 crc kubenswrapper[4690]: W0320 17:54:08.086358 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f074183_2793_4719_95b3_c2df447c93ab.slice/crio-632d93bdb25e1577ce54528e379bf654c9e80839ae59a555235474454263f7ec WatchSource:0}: Error finding container 632d93bdb25e1577ce54528e379bf654c9e80839ae59a555235474454263f7ec: Status 404 returned error can't find the container with id 632d93bdb25e1577ce54528e379bf654c9e80839ae59a555235474454263f7ec Mar 20 17:54:08 crc kubenswrapper[4690]: I0320 17:54:08.086450 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-799f9bd8b7-4q7w9"] Mar 20 17:54:08 crc kubenswrapper[4690]: I0320 17:54:08.819378 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" event={"ID":"3f074183-2793-4719-95b3-c2df447c93ab","Type":"ContainerStarted","Data":"ab13031c59d83115a50a0d1d5c37e54ac34a04d0ee8b8ecebf645487b2a34bca"} Mar 20 17:54:08 crc kubenswrapper[4690]: I0320 17:54:08.819834 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:08 crc kubenswrapper[4690]: I0320 17:54:08.819848 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:08 crc kubenswrapper[4690]: I0320 17:54:08.819856 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" event={"ID":"3f074183-2793-4719-95b3-c2df447c93ab","Type":"ContainerStarted","Data":"2675dbff4824a5ab04bb535eba67d739d635e659d166c8a24ccd3febb3f31edc"} Mar 20 17:54:08 crc kubenswrapper[4690]: I0320 17:54:08.819866 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" event={"ID":"3f074183-2793-4719-95b3-c2df447c93ab","Type":"ContainerStarted","Data":"632d93bdb25e1577ce54528e379bf654c9e80839ae59a555235474454263f7ec"} Mar 20 17:54:08 crc kubenswrapper[4690]: I0320 17:54:08.851160 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" podStartSLOduration=1.851140496 podStartE2EDuration="1.851140496s" podCreationTimestamp="2026-03-20 17:54:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:54:08.840301496 +0000 UTC m=+1323.706127184" watchObservedRunningTime="2026-03-20 17:54:08.851140496 +0000 UTC m=+1323.716966174" Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.678349 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.678909 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="ceilometer-central-agent" containerID="cri-o://e1f7e0388a8a1063e80631cc24d1602951be9d3c2eaa052af5e1aee8d9a983e3" gracePeriod=30 Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.679393 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="proxy-httpd" containerID="cri-o://3dad433afeaad3163f1f217c7c266a77d8f05e0dfc6cb01f8f33f4d9a49c1ca2" gracePeriod=30 Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.679445 
4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="sg-core" containerID="cri-o://408b2ebf9a8651945b2fcdd0c6cf8210f0de2d8332dd37200ca5a8d151c68af9" gracePeriod=30 Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.679480 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="ceilometer-notification-agent" containerID="cri-o://b9302a3f2fd2401586d51ea789f00ae1f6527ddafdc9f8afb41111770b82dbe5" gracePeriod=30 Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.690483 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-867c5896-qkwmr" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.726156 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.166:3000/\": EOF" Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.838075 4690 generic.go:334] "Generic (PLEG): container finished" podID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerID="408b2ebf9a8651945b2fcdd0c6cf8210f0de2d8332dd37200ca5a8d151c68af9" exitCode=2 Mar 20 17:54:09 crc kubenswrapper[4690]: I0320 17:54:09.838147 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerDied","Data":"408b2ebf9a8651945b2fcdd0c6cf8210f0de2d8332dd37200ca5a8d151c68af9"} Mar 20 17:54:10 crc kubenswrapper[4690]: I0320 17:54:10.848029 4690 generic.go:334] "Generic (PLEG): container finished" podID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerID="3dad433afeaad3163f1f217c7c266a77d8f05e0dfc6cb01f8f33f4d9a49c1ca2" exitCode=0 Mar 20 17:54:10 crc kubenswrapper[4690]: I0320 17:54:10.848059 4690 generic.go:334] "Generic (PLEG): container finished" podID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerID="b9302a3f2fd2401586d51ea789f00ae1f6527ddafdc9f8afb41111770b82dbe5" exitCode=0 Mar 20 17:54:10 crc kubenswrapper[4690]: I0320 17:54:10.848068 4690 generic.go:334] "Generic (PLEG): container finished" podID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerID="e1f7e0388a8a1063e80631cc24d1602951be9d3c2eaa052af5e1aee8d9a983e3" exitCode=0 Mar 20 17:54:10 crc kubenswrapper[4690]: I0320 17:54:10.848093 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerDied","Data":"3dad433afeaad3163f1f217c7c266a77d8f05e0dfc6cb01f8f33f4d9a49c1ca2"} Mar 20 17:54:10 crc kubenswrapper[4690]: I0320 17:54:10.848131 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerDied","Data":"b9302a3f2fd2401586d51ea789f00ae1f6527ddafdc9f8afb41111770b82dbe5"} Mar 20 17:54:10 crc kubenswrapper[4690]: I0320 17:54:10.848140 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerDied","Data":"e1f7e0388a8a1063e80631cc24d1602951be9d3c2eaa052af5e1aee8d9a983e3"} Mar 20 17:54:11 crc 
kubenswrapper[4690]: I0320 17:54:11.076992 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-plcj5"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.105442 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-plcj5"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.105571 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.152294 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rb6zb"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.154017 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.180587 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rb6zb"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.199035 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4k27\" (UniqueName: \"kubernetes.io/projected/fbcd1d86-01fb-4773-a19c-10ac29e045e1-kube-api-access-n4k27\") pod \"nova-api-db-create-plcj5\" (UID: \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\") " pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.199116 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbcd1d86-01fb-4773-a19c-10ac29e045e1-operator-scripts\") pod \"nova-api-db-create-plcj5\" (UID: \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\") " pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.199145 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9643bf40-9abd-47b9-9e39-2b1dfea6949d-operator-scripts\") pod \"nova-cell0-db-create-rb6zb\" (UID: \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\") " pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.199294 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf5jn\" (UniqueName: \"kubernetes.io/projected/9643bf40-9abd-47b9-9e39-2b1dfea6949d-kube-api-access-rf5jn\") pod \"nova-cell0-db-create-rb6zb\" (UID: \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\") " pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.251191 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2bb3-account-create-update-t2z8f"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.253001 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.256072 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.288708 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2bb3-account-create-update-t2z8f"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.300637 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4k27\" (UniqueName: \"kubernetes.io/projected/fbcd1d86-01fb-4773-a19c-10ac29e045e1-kube-api-access-n4k27\") pod \"nova-api-db-create-plcj5\" (UID: \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\") " pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.300702 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393d76ad-66f6-46fe-93d8-833e5193f216-operator-scripts\") pod \"nova-api-2bb3-account-create-update-t2z8f\" (UID: \"393d76ad-66f6-46fe-93d8-833e5193f216\") " pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.300750 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbcd1d86-01fb-4773-a19c-10ac29e045e1-operator-scripts\") pod \"nova-api-db-create-plcj5\" (UID: \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\") " pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.300774 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9643bf40-9abd-47b9-9e39-2b1dfea6949d-operator-scripts\") pod \"nova-cell0-db-create-rb6zb\" (UID: \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\") " pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.300838 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlld8\" (UniqueName: \"kubernetes.io/projected/393d76ad-66f6-46fe-93d8-833e5193f216-kube-api-access-wlld8\") pod \"nova-api-2bb3-account-create-update-t2z8f\" (UID: \"393d76ad-66f6-46fe-93d8-833e5193f216\") " pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.300894 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf5jn\" (UniqueName: \"kubernetes.io/projected/9643bf40-9abd-47b9-9e39-2b1dfea6949d-kube-api-access-rf5jn\") pod \"nova-cell0-db-create-rb6zb\" (UID: \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\") " pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.302034 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbcd1d86-01fb-4773-a19c-10ac29e045e1-operator-scripts\") pod \"nova-api-db-create-plcj5\" (UID: \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\") " pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.302194 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9643bf40-9abd-47b9-9e39-2b1dfea6949d-operator-scripts\") pod \"nova-cell0-db-create-rb6zb\" (UID: 
\"9643bf40-9abd-47b9-9e39-2b1dfea6949d\") " pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.318473 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf5jn\" (UniqueName: \"kubernetes.io/projected/9643bf40-9abd-47b9-9e39-2b1dfea6949d-kube-api-access-rf5jn\") pod \"nova-cell0-db-create-rb6zb\" (UID: \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\") " pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.333120 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4k27\" (UniqueName: \"kubernetes.io/projected/fbcd1d86-01fb-4773-a19c-10ac29e045e1-kube-api-access-n4k27\") pod \"nova-api-db-create-plcj5\" (UID: \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\") " pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.349074 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nb497"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.350280 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.387716 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nb497"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.402458 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlld8\" (UniqueName: \"kubernetes.io/projected/393d76ad-66f6-46fe-93d8-833e5193f216-kube-api-access-wlld8\") pod \"nova-api-2bb3-account-create-update-t2z8f\" (UID: \"393d76ad-66f6-46fe-93d8-833e5193f216\") " pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.402633 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393d76ad-66f6-46fe-93d8-833e5193f216-operator-scripts\") pod \"nova-api-2bb3-account-create-update-t2z8f\" (UID: \"393d76ad-66f6-46fe-93d8-833e5193f216\") " pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.403581 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393d76ad-66f6-46fe-93d8-833e5193f216-operator-scripts\") pod \"nova-api-2bb3-account-create-update-t2z8f\" (UID: \"393d76ad-66f6-46fe-93d8-833e5193f216\") " pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.429004 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlld8\" (UniqueName: \"kubernetes.io/projected/393d76ad-66f6-46fe-93d8-833e5193f216-kube-api-access-wlld8\") pod \"nova-api-2bb3-account-create-update-t2z8f\" (UID: \"393d76ad-66f6-46fe-93d8-833e5193f216\") " pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.440824 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.485660 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.493341 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ce9f-account-create-update-4vcxr"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.494510 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.503533 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.504330 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/140a83c9-1fb7-49cc-83a6-b78677db1779-operator-scripts\") pod \"nova-cell1-db-create-nb497\" (UID: \"140a83c9-1fb7-49cc-83a6-b78677db1779\") " pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.504427 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msjqv\" (UniqueName: \"kubernetes.io/projected/140a83c9-1fb7-49cc-83a6-b78677db1779-kube-api-access-msjqv\") pod \"nova-cell1-db-create-nb497\" (UID: \"140a83c9-1fb7-49cc-83a6-b78677db1779\") " pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.532506 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ce9f-account-create-update-4vcxr"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.577637 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.606706 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/140a83c9-1fb7-49cc-83a6-b78677db1779-operator-scripts\") pod \"nova-cell1-db-create-nb497\" (UID: \"140a83c9-1fb7-49cc-83a6-b78677db1779\") " pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.606922 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msjqv\" (UniqueName: \"kubernetes.io/projected/140a83c9-1fb7-49cc-83a6-b78677db1779-kube-api-access-msjqv\") pod \"nova-cell1-db-create-nb497\" (UID: \"140a83c9-1fb7-49cc-83a6-b78677db1779\") " pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.607009 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-operator-scripts\") pod \"nova-cell0-ce9f-account-create-update-4vcxr\" (UID: \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\") " pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.607092 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x9l4\" (UniqueName: \"kubernetes.io/projected/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-kube-api-access-6x9l4\") pod \"nova-cell0-ce9f-account-create-update-4vcxr\" (UID: \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\") " pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.608076 
4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/140a83c9-1fb7-49cc-83a6-b78677db1779-operator-scripts\") pod \"nova-cell1-db-create-nb497\" (UID: \"140a83c9-1fb7-49cc-83a6-b78677db1779\") " pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.639435 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msjqv\" (UniqueName: \"kubernetes.io/projected/140a83c9-1fb7-49cc-83a6-b78677db1779-kube-api-access-msjqv\") pod \"nova-cell1-db-create-nb497\" (UID: \"140a83c9-1fb7-49cc-83a6-b78677db1779\") " pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.656721 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3684-account-create-update-9hhdp"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.657914 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.661739 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.671393 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3684-account-create-update-9hhdp"] Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.709488 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/6ffe945e-e151-44eb-82d1-99c46c113fbe-kube-api-access-ck6bj\") pod \"nova-cell1-3684-account-create-update-9hhdp\" (UID: \"6ffe945e-e151-44eb-82d1-99c46c113fbe\") " pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.709545 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffe945e-e151-44eb-82d1-99c46c113fbe-operator-scripts\") pod \"nova-cell1-3684-account-create-update-9hhdp\" (UID: \"6ffe945e-e151-44eb-82d1-99c46c113fbe\") " pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.709571 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-operator-scripts\") pod \"nova-cell0-ce9f-account-create-update-4vcxr\" (UID: \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\") " pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.709605 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x9l4\" (UniqueName: \"kubernetes.io/projected/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-kube-api-access-6x9l4\") pod \"nova-cell0-ce9f-account-create-update-4vcxr\" (UID: \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\") " pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.710388 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-operator-scripts\") pod \"nova-cell0-ce9f-account-create-update-4vcxr\" (UID: \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\") " 
pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.733771 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x9l4\" (UniqueName: \"kubernetes.io/projected/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-kube-api-access-6x9l4\") pod \"nova-cell0-ce9f-account-create-update-4vcxr\" (UID: \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\") " pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.811391 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/6ffe945e-e151-44eb-82d1-99c46c113fbe-kube-api-access-ck6bj\") pod \"nova-cell1-3684-account-create-update-9hhdp\" (UID: \"6ffe945e-e151-44eb-82d1-99c46c113fbe\") " pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.811488 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffe945e-e151-44eb-82d1-99c46c113fbe-operator-scripts\") pod \"nova-cell1-3684-account-create-update-9hhdp\" (UID: \"6ffe945e-e151-44eb-82d1-99c46c113fbe\") " pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.812162 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffe945e-e151-44eb-82d1-99c46c113fbe-operator-scripts\") pod \"nova-cell1-3684-account-create-update-9hhdp\" (UID: \"6ffe945e-e151-44eb-82d1-99c46c113fbe\") " pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.821682 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.826214 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/6ffe945e-e151-44eb-82d1-99c46c113fbe-kube-api-access-ck6bj\") pod \"nova-cell1-3684-account-create-update-9hhdp\" (UID: \"6ffe945e-e151-44eb-82d1-99c46c113fbe\") " pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:11 crc kubenswrapper[4690]: I0320 17:54:11.837097 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:12 crc kubenswrapper[4690]: I0320 17:54:12.003538 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.466307 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.589127 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-sg-core-conf-yaml\") pod \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.589230 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-combined-ca-bundle\") pod \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.589287 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-run-httpd\") pod \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.589374 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdrjf\" (UniqueName: \"kubernetes.io/projected/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-kube-api-access-kdrjf\") pod \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.589423 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-config-data\") pod \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.589474 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-log-httpd\") pod \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.589497 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-scripts\") pod \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\" (UID: \"48156974-0ab6-4f24-8d90-c5dcdfbe9f37\") " Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.590206 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "48156974-0ab6-4f24-8d90-c5dcdfbe9f37" (UID: "48156974-0ab6-4f24-8d90-c5dcdfbe9f37"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.591452 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "48156974-0ab6-4f24-8d90-c5dcdfbe9f37" (UID: "48156974-0ab6-4f24-8d90-c5dcdfbe9f37"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.596003 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-scripts" (OuterVolumeSpecName: "scripts") pod "48156974-0ab6-4f24-8d90-c5dcdfbe9f37" (UID: "48156974-0ab6-4f24-8d90-c5dcdfbe9f37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.597888 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-kube-api-access-kdrjf" (OuterVolumeSpecName: "kube-api-access-kdrjf") pod "48156974-0ab6-4f24-8d90-c5dcdfbe9f37" (UID: "48156974-0ab6-4f24-8d90-c5dcdfbe9f37"). InnerVolumeSpecName "kube-api-access-kdrjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.625584 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "48156974-0ab6-4f24-8d90-c5dcdfbe9f37" (UID: "48156974-0ab6-4f24-8d90-c5dcdfbe9f37"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.643757 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3684-account-create-update-9hhdp"] Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.700377 4690 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.700399 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.700410 4690 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.700420 4690 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.700429 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdrjf\" (UniqueName: \"kubernetes.io/projected/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-kube-api-access-kdrjf\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.710211 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48156974-0ab6-4f24-8d90-c5dcdfbe9f37" (UID: "48156974-0ab6-4f24-8d90-c5dcdfbe9f37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.727468 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-config-data" (OuterVolumeSpecName: "config-data") pod "48156974-0ab6-4f24-8d90-c5dcdfbe9f37" (UID: "48156974-0ab6-4f24-8d90-c5dcdfbe9f37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.804493 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.804570 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48156974-0ab6-4f24-8d90-c5dcdfbe9f37-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.824058 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rb6zb"] Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.842957 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-plcj5"] Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.852064 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2bb3-account-create-update-t2z8f"] Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.927990 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" event={"ID":"6ffe945e-e151-44eb-82d1-99c46c113fbe","Type":"ContainerStarted","Data":"f6803aedebf555d37ebe569b5527b1cd56fd53b3197d0cbbd757d5db758379ad"} Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.928036 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" event={"ID":"6ffe945e-e151-44eb-82d1-99c46c113fbe","Type":"ContainerStarted","Data":"7d40ac3db15bf93fdf36652f533be62c19f98db7ec0716cb10af78ec4d0220d7"} Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.934333 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rb6zb" event={"ID":"9643bf40-9abd-47b9-9e39-2b1dfea6949d","Type":"ContainerStarted","Data":"de26b9ebd3d1e97e60d38b9c71f1258b34be4f54b33bba4c681a09ecd10ad071"} Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.945559 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2bb3-account-create-update-t2z8f" event={"ID":"393d76ad-66f6-46fe-93d8-833e5193f216","Type":"ContainerStarted","Data":"37dc383b16818e8de9ee15a20fb1f312d35fc74a48b43b24630a447421e39112"} Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.956287 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48156974-0ab6-4f24-8d90-c5dcdfbe9f37","Type":"ContainerDied","Data":"85005629a656067448cb1c231d7ab8725957d4f5240bafa4723337aa4b8c4bfb"} Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.956339 4690 scope.go:117] "RemoveContainer" containerID="3dad433afeaad3163f1f217c7c266a77d8f05e0dfc6cb01f8f33f4d9a49c1ca2" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.956463 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.972880 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a537f291-8787-434c-84bc-4355ccccbe47","Type":"ContainerStarted","Data":"caca53c5736f1396edb4afcd4e3ccf42682611249bda15886a9200c9f398d885"} Mar 20 17:54:15 crc kubenswrapper[4690]: I0320 17:54:15.996390 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-plcj5" event={"ID":"fbcd1d86-01fb-4773-a19c-10ac29e045e1","Type":"ContainerStarted","Data":"3931c227597f25a4090ce7057a390d2bec44fe77832e162fb336b5aca37da02f"} Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.078429 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ce9f-account-create-update-4vcxr"] Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.093226 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nb497"] Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.094152 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.409052593 podStartE2EDuration="13.094140209s" podCreationTimestamp="2026-03-20 17:54:03 +0000 UTC" firstStartedPulling="2026-03-20 17:54:04.327086544 +0000 UTC m=+1319.192912232" lastFinishedPulling="2026-03-20 17:54:15.01217416 +0000 UTC m=+1329.877999848" observedRunningTime="2026-03-20 17:54:16.026876847 +0000 UTC m=+1330.892702535" watchObservedRunningTime="2026-03-20 17:54:16.094140209 +0000 UTC m=+1330.959965887" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.109739 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" podStartSLOduration=5.10971486 podStartE2EDuration="5.10971486s" podCreationTimestamp="2026-03-20 17:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:54:16.053620468 +0000 UTC m=+1330.919446146" watchObservedRunningTime="2026-03-20 17:54:16.10971486 +0000 UTC m=+1330.975540538" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.160747 4690 scope.go:117] "RemoveContainer" containerID="408b2ebf9a8651945b2fcdd0c6cf8210f0de2d8332dd37200ca5a8d151c68af9" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.231161 4690 scope.go:117] "RemoveContainer" containerID="b9302a3f2fd2401586d51ea789f00ae1f6527ddafdc9f8afb41111770b82dbe5" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.238742 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.257644 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.265034 4690 scope.go:117] "RemoveContainer" containerID="e1f7e0388a8a1063e80631cc24d1602951be9d3c2eaa052af5e1aee8d9a983e3" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.284729 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:16 crc kubenswrapper[4690]: E0320 17:54:16.285210 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="proxy-httpd" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.285233 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" 
containerName="proxy-httpd" Mar 20 17:54:16 crc kubenswrapper[4690]: E0320 17:54:16.285274 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="sg-core" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.285284 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="sg-core" Mar 20 17:54:16 crc kubenswrapper[4690]: E0320 17:54:16.285301 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="ceilometer-central-agent" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.285308 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="ceilometer-central-agent" Mar 20 17:54:16 crc kubenswrapper[4690]: E0320 17:54:16.285324 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="ceilometer-notification-agent" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.285332 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="ceilometer-notification-agent" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.285523 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="proxy-httpd" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.285538 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="ceilometer-central-agent" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.285551 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="sg-core" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.285573 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" containerName="ceilometer-notification-agent" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.287910 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.291677 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.292483 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.319385 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-scripts\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.319494 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.319582 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-run-httpd\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.319631 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-config-data\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.319668 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnv7d\" (UniqueName: \"kubernetes.io/projected/4e76b465-2eaf-4a41-9f08-5b397fb08181-kube-api-access-hnv7d\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.319701 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-log-httpd\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.319804 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.326555 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.424107 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-scripts\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.424533 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.424611 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-run-httpd\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.424647 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-config-data\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.424678 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnv7d\" (UniqueName: \"kubernetes.io/projected/4e76b465-2eaf-4a41-9f08-5b397fb08181-kube-api-access-hnv7d\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.424713 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-log-httpd\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.424776 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.426140 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-run-httpd\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.426198 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-log-httpd\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.430065 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-config-data\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.435290 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.435411 4690 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-scripts\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.435873 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.447470 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnv7d\" (UniqueName: \"kubernetes.io/projected/4e76b465-2eaf-4a41-9f08-5b397fb08181-kube-api-access-hnv7d\") pod \"ceilometer-0\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " pod="openstack/ceilometer-0" Mar 20 17:54:16 crc kubenswrapper[4690]: I0320 17:54:16.624914 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.005327 4690 generic.go:334] "Generic (PLEG): container finished" podID="6ffe945e-e151-44eb-82d1-99c46c113fbe" containerID="f6803aedebf555d37ebe569b5527b1cd56fd53b3197d0cbbd757d5db758379ad" exitCode=0 Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.005428 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" event={"ID":"6ffe945e-e151-44eb-82d1-99c46c113fbe","Type":"ContainerDied","Data":"f6803aedebf555d37ebe569b5527b1cd56fd53b3197d0cbbd757d5db758379ad"} Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.007142 4690 generic.go:334] "Generic (PLEG): container finished" podID="9643bf40-9abd-47b9-9e39-2b1dfea6949d" containerID="61274cb8cd6778715a81cba83f182cc3dc615d2d6ca808fe2febc61a37e4d6ad" exitCode=0 Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.007208 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rb6zb" event={"ID":"9643bf40-9abd-47b9-9e39-2b1dfea6949d","Type":"ContainerDied","Data":"61274cb8cd6778715a81cba83f182cc3dc615d2d6ca808fe2febc61a37e4d6ad"} Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.009502 4690 generic.go:334] "Generic (PLEG): container finished" podID="140a83c9-1fb7-49cc-83a6-b78677db1779" containerID="d559e825f90c6dd81b674f00f720dd771ed4238cfc8ce5c0300dcfadf0dd3bb6" exitCode=0 Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.009576 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nb497" event={"ID":"140a83c9-1fb7-49cc-83a6-b78677db1779","Type":"ContainerDied","Data":"d559e825f90c6dd81b674f00f720dd771ed4238cfc8ce5c0300dcfadf0dd3bb6"} Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.009595 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nb497" event={"ID":"140a83c9-1fb7-49cc-83a6-b78677db1779","Type":"ContainerStarted","Data":"5c1bd48613a54f98e1fa3598f9e29a2eaae96c241af71a5590c59e58160b7673"} Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.017637 4690 generic.go:334] "Generic (PLEG): container finished" podID="393d76ad-66f6-46fe-93d8-833e5193f216" containerID="2271441d1328cfe6e45b0d4e507a22440a0fcaa13840aada66b397bde92e354c" exitCode=0 Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.017681 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-2bb3-account-create-update-t2z8f" event={"ID":"393d76ad-66f6-46fe-93d8-833e5193f216","Type":"ContainerDied","Data":"2271441d1328cfe6e45b0d4e507a22440a0fcaa13840aada66b397bde92e354c"} Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.025563 4690 generic.go:334] "Generic (PLEG): container finished" podID="ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30" containerID="c23a13d04ecada2157aaa2f4170be1d91cd1e581bebe8dfc0552d1ff02ca790a" exitCode=0 Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.025797 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" event={"ID":"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30","Type":"ContainerDied","Data":"c23a13d04ecada2157aaa2f4170be1d91cd1e581bebe8dfc0552d1ff02ca790a"} Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.025841 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" event={"ID":"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30","Type":"ContainerStarted","Data":"232a328b84a641597c9ded8a9f5997c7b0c0e87b468fafd71669d623cc128afa"} Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.027616 4690 generic.go:334] "Generic (PLEG): container finished" podID="fbcd1d86-01fb-4773-a19c-10ac29e045e1" containerID="a2c347262d69e38b72766e97307bcd4b12f3a01e70cb971c0fcefa75c411a292" exitCode=0 Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.028414 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-plcj5" event={"ID":"fbcd1d86-01fb-4773-a19c-10ac29e045e1","Type":"ContainerDied","Data":"a2c347262d69e38b72766e97307bcd4b12f3a01e70cb971c0fcefa75c411a292"} Mar 20 17:54:17 crc kubenswrapper[4690]: W0320 17:54:17.084073 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e76b465_2eaf_4a41_9f08_5b397fb08181.slice/crio-d2c3bf32129e84a1bb8d4795d9a3c039911f1e859ee6186956ad1b362ca18eb8 WatchSource:0}: Error finding container d2c3bf32129e84a1bb8d4795d9a3c039911f1e859ee6186956ad1b362ca18eb8: Status 404 returned error can't find the container with id d2c3bf32129e84a1bb8d4795d9a3c039911f1e859ee6186956ad1b362ca18eb8 Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.090911 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.496807 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.499823 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" Mar 20 17:54:17 crc kubenswrapper[4690]: I0320 17:54:17.895663 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48156974-0ab6-4f24-8d90-c5dcdfbe9f37" path="/var/lib/kubelet/pods/48156974-0ab6-4f24-8d90-c5dcdfbe9f37/volumes" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.089149 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerStarted","Data":"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101"} Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.089485 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerStarted","Data":"d2c3bf32129e84a1bb8d4795d9a3c039911f1e859ee6186956ad1b362ca18eb8"} Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.512955 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.670986 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393d76ad-66f6-46fe-93d8-833e5193f216-operator-scripts\") pod \"393d76ad-66f6-46fe-93d8-833e5193f216\" (UID: \"393d76ad-66f6-46fe-93d8-833e5193f216\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.671348 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlld8\" (UniqueName: \"kubernetes.io/projected/393d76ad-66f6-46fe-93d8-833e5193f216-kube-api-access-wlld8\") pod \"393d76ad-66f6-46fe-93d8-833e5193f216\" (UID: \"393d76ad-66f6-46fe-93d8-833e5193f216\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.672015 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393d76ad-66f6-46fe-93d8-833e5193f216-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "393d76ad-66f6-46fe-93d8-833e5193f216" (UID: "393d76ad-66f6-46fe-93d8-833e5193f216"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.686634 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393d76ad-66f6-46fe-93d8-833e5193f216-kube-api-access-wlld8" (OuterVolumeSpecName: "kube-api-access-wlld8") pod "393d76ad-66f6-46fe-93d8-833e5193f216" (UID: "393d76ad-66f6-46fe-93d8-833e5193f216"). InnerVolumeSpecName "kube-api-access-wlld8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.737184 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.743568 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.750222 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.759636 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772302 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/140a83c9-1fb7-49cc-83a6-b78677db1779-operator-scripts\") pod \"140a83c9-1fb7-49cc-83a6-b78677db1779\" (UID: \"140a83c9-1fb7-49cc-83a6-b78677db1779\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772343 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msjqv\" (UniqueName: \"kubernetes.io/projected/140a83c9-1fb7-49cc-83a6-b78677db1779-kube-api-access-msjqv\") pod \"140a83c9-1fb7-49cc-83a6-b78677db1779\" (UID: \"140a83c9-1fb7-49cc-83a6-b78677db1779\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772375 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbcd1d86-01fb-4773-a19c-10ac29e045e1-operator-scripts\") pod \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\" (UID: \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772404 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-operator-scripts\") pod \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\" (UID: \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772436 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4k27\" (UniqueName: \"kubernetes.io/projected/fbcd1d86-01fb-4773-a19c-10ac29e045e1-kube-api-access-n4k27\") pod \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\" (UID: \"fbcd1d86-01fb-4773-a19c-10ac29e045e1\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772456 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf5jn\" (UniqueName: \"kubernetes.io/projected/9643bf40-9abd-47b9-9e39-2b1dfea6949d-kube-api-access-rf5jn\") pod \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\" (UID: \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772473 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9643bf40-9abd-47b9-9e39-2b1dfea6949d-operator-scripts\") pod \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\" (UID: \"9643bf40-9abd-47b9-9e39-2b1dfea6949d\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772493 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x9l4\" (UniqueName: \"kubernetes.io/projected/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-kube-api-access-6x9l4\") pod \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\" (UID: \"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772751 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393d76ad-66f6-46fe-93d8-833e5193f216-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.772767 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlld8\" (UniqueName: \"kubernetes.io/projected/393d76ad-66f6-46fe-93d8-833e5193f216-kube-api-access-wlld8\") on node \"crc\" DevicePath \"\"" Mar 20 
17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.773484 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30" (UID: "ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.773817 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/140a83c9-1fb7-49cc-83a6-b78677db1779-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "140a83c9-1fb7-49cc-83a6-b78677db1779" (UID: "140a83c9-1fb7-49cc-83a6-b78677db1779"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.774152 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.774916 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9643bf40-9abd-47b9-9e39-2b1dfea6949d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9643bf40-9abd-47b9-9e39-2b1dfea6949d" (UID: "9643bf40-9abd-47b9-9e39-2b1dfea6949d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.775011 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbcd1d86-01fb-4773-a19c-10ac29e045e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbcd1d86-01fb-4773-a19c-10ac29e045e1" (UID: "fbcd1d86-01fb-4773-a19c-10ac29e045e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.775493 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-kube-api-access-6x9l4" (OuterVolumeSpecName: "kube-api-access-6x9l4") pod "ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30" (UID: "ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30"). InnerVolumeSpecName "kube-api-access-6x9l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.776008 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140a83c9-1fb7-49cc-83a6-b78677db1779-kube-api-access-msjqv" (OuterVolumeSpecName: "kube-api-access-msjqv") pod "140a83c9-1fb7-49cc-83a6-b78677db1779" (UID: "140a83c9-1fb7-49cc-83a6-b78677db1779"). InnerVolumeSpecName "kube-api-access-msjqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.778180 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcd1d86-01fb-4773-a19c-10ac29e045e1-kube-api-access-n4k27" (OuterVolumeSpecName: "kube-api-access-n4k27") pod "fbcd1d86-01fb-4773-a19c-10ac29e045e1" (UID: "fbcd1d86-01fb-4773-a19c-10ac29e045e1"). InnerVolumeSpecName "kube-api-access-n4k27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.781431 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9643bf40-9abd-47b9-9e39-2b1dfea6949d-kube-api-access-rf5jn" (OuterVolumeSpecName: "kube-api-access-rf5jn") pod "9643bf40-9abd-47b9-9e39-2b1dfea6949d" (UID: "9643bf40-9abd-47b9-9e39-2b1dfea6949d"). InnerVolumeSpecName "kube-api-access-rf5jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.874271 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbcd1d86-01fb-4773-a19c-10ac29e045e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.874306 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.874317 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4k27\" (UniqueName: \"kubernetes.io/projected/fbcd1d86-01fb-4773-a19c-10ac29e045e1-kube-api-access-n4k27\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.874327 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf5jn\" (UniqueName: \"kubernetes.io/projected/9643bf40-9abd-47b9-9e39-2b1dfea6949d-kube-api-access-rf5jn\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.874337 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9643bf40-9abd-47b9-9e39-2b1dfea6949d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.874346 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x9l4\" (UniqueName: \"kubernetes.io/projected/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30-kube-api-access-6x9l4\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.874354 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/140a83c9-1fb7-49cc-83a6-b78677db1779-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.874362 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msjqv\" (UniqueName: \"kubernetes.io/projected/140a83c9-1fb7-49cc-83a6-b78677db1779-kube-api-access-msjqv\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.975016 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/6ffe945e-e151-44eb-82d1-99c46c113fbe-kube-api-access-ck6bj\") pod \"6ffe945e-e151-44eb-82d1-99c46c113fbe\" (UID: \"6ffe945e-e151-44eb-82d1-99c46c113fbe\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.975193 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffe945e-e151-44eb-82d1-99c46c113fbe-operator-scripts\") pod \"6ffe945e-e151-44eb-82d1-99c46c113fbe\" (UID: \"6ffe945e-e151-44eb-82d1-99c46c113fbe\") " Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.975575 4690 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ffe945e-e151-44eb-82d1-99c46c113fbe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ffe945e-e151-44eb-82d1-99c46c113fbe" (UID: "6ffe945e-e151-44eb-82d1-99c46c113fbe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.976308 4690 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffe945e-e151-44eb-82d1-99c46c113fbe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:18 crc kubenswrapper[4690]: I0320 17:54:18.980926 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ffe945e-e151-44eb-82d1-99c46c113fbe-kube-api-access-ck6bj" (OuterVolumeSpecName: "kube-api-access-ck6bj") pod "6ffe945e-e151-44eb-82d1-99c46c113fbe" (UID: "6ffe945e-e151-44eb-82d1-99c46c113fbe"). InnerVolumeSpecName "kube-api-access-ck6bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.014853 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.044087 4690 scope.go:117] "RemoveContainer" containerID="6403cdd07c5bf3519ce0d679d19102efd34e6c654fbee43e83f521f45730b56f" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.077919 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck6bj\" (UniqueName: \"kubernetes.io/projected/6ffe945e-e151-44eb-82d1-99c46c113fbe-kube-api-access-ck6bj\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.107356 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerStarted","Data":"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1"} Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.113989 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nb497" event={"ID":"140a83c9-1fb7-49cc-83a6-b78677db1779","Type":"ContainerDied","Data":"5c1bd48613a54f98e1fa3598f9e29a2eaae96c241af71a5590c59e58160b7673"} Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.114016 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c1bd48613a54f98e1fa3598f9e29a2eaae96c241af71a5590c59e58160b7673" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.114068 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nb497" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.129062 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2bb3-account-create-update-t2z8f" event={"ID":"393d76ad-66f6-46fe-93d8-833e5193f216","Type":"ContainerDied","Data":"37dc383b16818e8de9ee15a20fb1f312d35fc74a48b43b24630a447421e39112"} Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.129104 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37dc383b16818e8de9ee15a20fb1f312d35fc74a48b43b24630a447421e39112" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.129156 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2bb3-account-create-update-t2z8f" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.132112 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" event={"ID":"ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30","Type":"ContainerDied","Data":"232a328b84a641597c9ded8a9f5997c7b0c0e87b468fafd71669d623cc128afa"} Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.132146 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="232a328b84a641597c9ded8a9f5997c7b0c0e87b468fafd71669d623cc128afa" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.132211 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ce9f-account-create-update-4vcxr" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.148803 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-plcj5" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.148793 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-plcj5" event={"ID":"fbcd1d86-01fb-4773-a19c-10ac29e045e1","Type":"ContainerDied","Data":"3931c227597f25a4090ce7057a390d2bec44fe77832e162fb336b5aca37da02f"} Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.149360 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3931c227597f25a4090ce7057a390d2bec44fe77832e162fb336b5aca37da02f" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.151162 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" event={"ID":"6ffe945e-e151-44eb-82d1-99c46c113fbe","Type":"ContainerDied","Data":"7d40ac3db15bf93fdf36652f533be62c19f98db7ec0716cb10af78ec4d0220d7"} Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.151191 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d40ac3db15bf93fdf36652f533be62c19f98db7ec0716cb10af78ec4d0220d7" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.151246 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3684-account-create-update-9hhdp" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.153776 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rb6zb" event={"ID":"9643bf40-9abd-47b9-9e39-2b1dfea6949d","Type":"ContainerDied","Data":"de26b9ebd3d1e97e60d38b9c71f1258b34be4f54b33bba4c681a09ecd10ad071"} Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.153819 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de26b9ebd3d1e97e60d38b9c71f1258b34be4f54b33bba4c681a09ecd10ad071" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.153849 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rb6zb" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.625336 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-675c5fd7b7-z9vsh" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.689974 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-867c5896-qkwmr" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.690086 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.707859 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75b55fdddd-6ht5q"] Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.708136 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75b55fdddd-6ht5q" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerName="neutron-api" containerID="cri-o://9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393" gracePeriod=30 Mar 20 17:54:19 crc kubenswrapper[4690]: I0320 17:54:19.708319 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75b55fdddd-6ht5q" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerName="neutron-httpd" containerID="cri-o://d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703" gracePeriod=30 Mar 20 17:54:20 crc kubenswrapper[4690]: I0320 17:54:20.164738 4690 generic.go:334] "Generic (PLEG): container finished" podID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerID="d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703" exitCode=0 Mar 20 17:54:20 crc kubenswrapper[4690]: I0320 17:54:20.165021 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b55fdddd-6ht5q" event={"ID":"04ab06c3-11ab-4253-bafa-fea6ac93bedf","Type":"ContainerDied","Data":"d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703"} Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.175351 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerStarted","Data":"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1"} Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.907578 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-njclq"] Mar 20 17:54:21 crc kubenswrapper[4690]: E0320 17:54:21.908420 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9643bf40-9abd-47b9-9e39-2b1dfea6949d" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908443 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="9643bf40-9abd-47b9-9e39-2b1dfea6949d" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: E0320 17:54:21.908460 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30" containerName="mariadb-account-create-update" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908468 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30" containerName="mariadb-account-create-update" Mar 20 
17:54:21 crc kubenswrapper[4690]: E0320 17:54:21.908508 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffe945e-e151-44eb-82d1-99c46c113fbe" containerName="mariadb-account-create-update" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908517 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffe945e-e151-44eb-82d1-99c46c113fbe" containerName="mariadb-account-create-update" Mar 20 17:54:21 crc kubenswrapper[4690]: E0320 17:54:21.908532 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140a83c9-1fb7-49cc-83a6-b78677db1779" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908552 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="140a83c9-1fb7-49cc-83a6-b78677db1779" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: E0320 17:54:21.908565 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcd1d86-01fb-4773-a19c-10ac29e045e1" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908574 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcd1d86-01fb-4773-a19c-10ac29e045e1" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: E0320 17:54:21.908592 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393d76ad-66f6-46fe-93d8-833e5193f216" containerName="mariadb-account-create-update" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908600 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="393d76ad-66f6-46fe-93d8-833e5193f216" containerName="mariadb-account-create-update" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908796 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcd1d86-01fb-4773-a19c-10ac29e045e1" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908807 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffe945e-e151-44eb-82d1-99c46c113fbe" containerName="mariadb-account-create-update" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908818 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="9643bf40-9abd-47b9-9e39-2b1dfea6949d" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908840 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="140a83c9-1fb7-49cc-83a6-b78677db1779" containerName="mariadb-database-create" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908848 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="393d76ad-66f6-46fe-93d8-833e5193f216" containerName="mariadb-account-create-update" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.908859 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30" containerName="mariadb-account-create-update" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.909593 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.918616 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.921851 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.922302 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f6gl8" Mar 20 17:54:21 crc kubenswrapper[4690]: I0320 17:54:21.922829 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-njclq"] Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.037488 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bgh9\" (UniqueName: \"kubernetes.io/projected/95830c09-d53f-4e08-800d-09d227668aee-kube-api-access-5bgh9\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.037624 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-scripts\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.037825 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-config-data\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.037857 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.139123 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bgh9\" (UniqueName: \"kubernetes.io/projected/95830c09-d53f-4e08-800d-09d227668aee-kube-api-access-5bgh9\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.139480 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-scripts\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.139712 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-config-data\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: 
\"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.139807 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.150947 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-scripts\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.151595 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.152388 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-config-data\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.159233 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bgh9\" (UniqueName: \"kubernetes.io/projected/95830c09-d53f-4e08-800d-09d227668aee-kube-api-access-5bgh9\") pod \"nova-cell0-conductor-db-sync-njclq\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.228732 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:22 crc kubenswrapper[4690]: I0320 17:54:22.739650 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-njclq"] Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.106368 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.192664 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerStarted","Data":"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38"} Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.192798 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="ceilometer-central-agent" containerID="cri-o://126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101" gracePeriod=30 Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.192852 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="proxy-httpd" containerID="cri-o://86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38" gracePeriod=30 Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.192864 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="ceilometer-notification-agent" containerID="cri-o://bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1" gracePeriod=30 Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.192889 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="sg-core" containerID="cri-o://edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1" gracePeriod=30 Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.192827 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.197943 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-njclq" event={"ID":"95830c09-d53f-4e08-800d-09d227668aee","Type":"ContainerStarted","Data":"2c3a58957c495edd7112d9fe343d26b3e7ddd9b78c8667d10f62f9da18730bc3"} Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.206887 4690 generic.go:334] "Generic (PLEG): container finished" podID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerID="9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393" exitCode=0 Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.207136 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b55fdddd-6ht5q" event={"ID":"04ab06c3-11ab-4253-bafa-fea6ac93bedf","Type":"ContainerDied","Data":"9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393"} Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.207230 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b55fdddd-6ht5q" event={"ID":"04ab06c3-11ab-4253-bafa-fea6ac93bedf","Type":"ContainerDied","Data":"df618994c4b8b0b639b4d393714ca9a0a45e07fc92084bb9a36b4a4dedd90888"} Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.207349 4690 scope.go:117] "RemoveContainer" containerID="d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.207586 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75b55fdddd-6ht5q" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.224461 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.712569168 podStartE2EDuration="7.224441862s" podCreationTimestamp="2026-03-20 17:54:16 +0000 UTC" firstStartedPulling="2026-03-20 17:54:17.094136858 +0000 UTC m=+1331.959962536" lastFinishedPulling="2026-03-20 17:54:22.606009552 +0000 UTC m=+1337.471835230" observedRunningTime="2026-03-20 17:54:23.210511066 +0000 UTC m=+1338.076336744" watchObservedRunningTime="2026-03-20 17:54:23.224441862 +0000 UTC m=+1338.090267550" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.230412 4690 scope.go:117] "RemoveContainer" containerID="9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.250875 4690 scope.go:117] "RemoveContainer" containerID="d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703" Mar 20 17:54:23 crc kubenswrapper[4690]: E0320 17:54:23.251984 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703\": container with ID starting with d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703 not found: ID does not exist" containerID="d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.252015 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703"} err="failed to get container status \"d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703\": rpc error: code = NotFound desc = could not find container \"d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703\": container with ID starting with d5ab180629d73512bae5dd907281bbec2640f129392bd67d7ae8ac78c3ca5703 not found: ID does not exist" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.252036 4690 scope.go:117] "RemoveContainer" containerID="9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393" Mar 20 17:54:23 crc kubenswrapper[4690]: E0320 17:54:23.252353 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393\": container with ID starting with 9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393 not found: ID does not exist" containerID="9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.252375 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393"} err="failed to get container status \"9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393\": rpc error: code = NotFound desc = could not find container \"9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393\": container with ID starting with 9dc9555160a85373f86d6289076c054b66d8a4efcaf73bd80e525e3f4b9a1393 not found: ID does not exist" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.262784 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-combined-ca-bundle\") pod \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.263044 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-httpd-config\") pod \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.263143 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-config\") pod \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.263192 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbkbz\" (UniqueName: \"kubernetes.io/projected/04ab06c3-11ab-4253-bafa-fea6ac93bedf-kube-api-access-kbkbz\") pod \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.263232 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-ovndb-tls-certs\") pod \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\" (UID: \"04ab06c3-11ab-4253-bafa-fea6ac93bedf\") " Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.273420 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "04ab06c3-11ab-4253-bafa-fea6ac93bedf" (UID: "04ab06c3-11ab-4253-bafa-fea6ac93bedf"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.273628 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ab06c3-11ab-4253-bafa-fea6ac93bedf-kube-api-access-kbkbz" (OuterVolumeSpecName: "kube-api-access-kbkbz") pod "04ab06c3-11ab-4253-bafa-fea6ac93bedf" (UID: "04ab06c3-11ab-4253-bafa-fea6ac93bedf"). InnerVolumeSpecName "kube-api-access-kbkbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.330830 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-config" (OuterVolumeSpecName: "config") pod "04ab06c3-11ab-4253-bafa-fea6ac93bedf" (UID: "04ab06c3-11ab-4253-bafa-fea6ac93bedf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.334419 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04ab06c3-11ab-4253-bafa-fea6ac93bedf" (UID: "04ab06c3-11ab-4253-bafa-fea6ac93bedf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.351485 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "04ab06c3-11ab-4253-bafa-fea6ac93bedf" (UID: "04ab06c3-11ab-4253-bafa-fea6ac93bedf"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.365670 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.365705 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbkbz\" (UniqueName: \"kubernetes.io/projected/04ab06c3-11ab-4253-bafa-fea6ac93bedf-kube-api-access-kbkbz\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.365716 4690 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.365726 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.365736 4690 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04ab06c3-11ab-4253-bafa-fea6ac93bedf-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.552412 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75b55fdddd-6ht5q"] Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.577754 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-75b55fdddd-6ht5q"] Mar 20 17:54:23 crc kubenswrapper[4690]: I0320 17:54:23.909382 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" path="/var/lib/kubelet/pods/04ab06c3-11ab-4253-bafa-fea6ac93bedf/volumes" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.008498 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.184934 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-log-httpd\") pod \"4e76b465-2eaf-4a41-9f08-5b397fb08181\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.185199 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnv7d\" (UniqueName: \"kubernetes.io/projected/4e76b465-2eaf-4a41-9f08-5b397fb08181-kube-api-access-hnv7d\") pod \"4e76b465-2eaf-4a41-9f08-5b397fb08181\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.185292 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-run-httpd\") pod \"4e76b465-2eaf-4a41-9f08-5b397fb08181\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.185525 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-sg-core-conf-yaml\") pod \"4e76b465-2eaf-4a41-9f08-5b397fb08181\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.185634 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-config-data\") pod \"4e76b465-2eaf-4a41-9f08-5b397fb08181\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.185690 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-combined-ca-bundle\") pod \"4e76b465-2eaf-4a41-9f08-5b397fb08181\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.185758 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-scripts\") pod \"4e76b465-2eaf-4a41-9f08-5b397fb08181\" (UID: \"4e76b465-2eaf-4a41-9f08-5b397fb08181\") " Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.187349 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4e76b465-2eaf-4a41-9f08-5b397fb08181" (UID: "4e76b465-2eaf-4a41-9f08-5b397fb08181"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.188180 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4e76b465-2eaf-4a41-9f08-5b397fb08181" (UID: "4e76b465-2eaf-4a41-9f08-5b397fb08181"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.193855 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-scripts" (OuterVolumeSpecName: "scripts") pod "4e76b465-2eaf-4a41-9f08-5b397fb08181" (UID: "4e76b465-2eaf-4a41-9f08-5b397fb08181"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.193886 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e76b465-2eaf-4a41-9f08-5b397fb08181-kube-api-access-hnv7d" (OuterVolumeSpecName: "kube-api-access-hnv7d") pod "4e76b465-2eaf-4a41-9f08-5b397fb08181" (UID: "4e76b465-2eaf-4a41-9f08-5b397fb08181"). InnerVolumeSpecName "kube-api-access-hnv7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.224545 4690 generic.go:334] "Generic (PLEG): container finished" podID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerID="86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38" exitCode=0 Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.224768 4690 generic.go:334] "Generic (PLEG): container finished" podID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerID="edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1" exitCode=2 Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.224851 4690 generic.go:334] "Generic (PLEG): container finished" podID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerID="bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1" exitCode=0 Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.224961 4690 generic.go:334] "Generic (PLEG): container finished" podID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerID="126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101" exitCode=0 Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.225068 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerDied","Data":"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38"} Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.225148 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerDied","Data":"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1"} Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.225207 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerDied","Data":"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1"} Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.225366 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerDied","Data":"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101"} Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.225482 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e76b465-2eaf-4a41-9f08-5b397fb08181","Type":"ContainerDied","Data":"d2c3bf32129e84a1bb8d4795d9a3c039911f1e859ee6186956ad1b362ca18eb8"} Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.225576 4690 scope.go:117] "RemoveContainer" 
containerID="86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.225837 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.241989 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4e76b465-2eaf-4a41-9f08-5b397fb08181" (UID: "4e76b465-2eaf-4a41-9f08-5b397fb08181"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.275994 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e76b465-2eaf-4a41-9f08-5b397fb08181" (UID: "4e76b465-2eaf-4a41-9f08-5b397fb08181"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.288001 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.288033 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.288042 4690 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.288051 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnv7d\" (UniqueName: \"kubernetes.io/projected/4e76b465-2eaf-4a41-9f08-5b397fb08181-kube-api-access-hnv7d\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.288060 4690 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e76b465-2eaf-4a41-9f08-5b397fb08181-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.288070 4690 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.290282 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-config-data" (OuterVolumeSpecName: "config-data") pod "4e76b465-2eaf-4a41-9f08-5b397fb08181" (UID: "4e76b465-2eaf-4a41-9f08-5b397fb08181"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.389412 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e76b465-2eaf-4a41-9f08-5b397fb08181-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.402350 4690 scope.go:117] "RemoveContainer" containerID="edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.426498 4690 scope.go:117] "RemoveContainer" containerID="bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.458672 4690 scope.go:117] "RemoveContainer" containerID="126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.484063 4690 scope.go:117] "RemoveContainer" containerID="86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.484573 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": container with ID starting with 86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38 not found: ID does not exist" containerID="86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.484632 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38"} err="failed to get container status \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": rpc error: code = NotFound desc = could not find container \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": container with ID starting with 86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.484664 4690 scope.go:117] "RemoveContainer" containerID="edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.485076 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": container with ID starting with edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1 not found: ID does not exist" containerID="edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.485107 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1"} err="failed to get container status \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": rpc error: code = NotFound desc = could not find container \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": container with ID starting with edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.485129 4690 scope.go:117] "RemoveContainer" containerID="bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 
17:54:24.485540 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": container with ID starting with bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1 not found: ID does not exist" containerID="bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.485569 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1"} err="failed to get container status \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": rpc error: code = NotFound desc = could not find container \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": container with ID starting with bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.485585 4690 scope.go:117] "RemoveContainer" containerID="126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.485812 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": container with ID starting with 126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101 not found: ID does not exist" containerID="126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.485842 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101"} err="failed to get container status \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": rpc error: code = NotFound desc = could not find container \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": container with ID starting with 126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.485859 4690 scope.go:117] "RemoveContainer" containerID="86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.486231 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38"} err="failed to get container status \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": rpc error: code = NotFound desc = could not find container \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": container with ID starting with 86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.486271 4690 scope.go:117] "RemoveContainer" containerID="edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.486514 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1"} err="failed to get container status \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": rpc error: code = 
NotFound desc = could not find container \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": container with ID starting with edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.486540 4690 scope.go:117] "RemoveContainer" containerID="bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.486750 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1"} err="failed to get container status \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": rpc error: code = NotFound desc = could not find container \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": container with ID starting with bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.486769 4690 scope.go:117] "RemoveContainer" containerID="126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.486955 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101"} err="failed to get container status \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": rpc error: code = NotFound desc = could not find container \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": container with ID starting with 126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.486977 4690 scope.go:117] "RemoveContainer" containerID="86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.487125 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38"} err="failed to get container status \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": rpc error: code = NotFound desc = could not find container \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": container with ID starting with 86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.487145 4690 scope.go:117] "RemoveContainer" containerID="edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.487348 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1"} err="failed to get container status \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": rpc error: code = NotFound desc = could not find container \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": container with ID starting with edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.487371 4690 scope.go:117] "RemoveContainer" containerID="bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 
17:54:24.487696 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1"} err="failed to get container status \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": rpc error: code = NotFound desc = could not find container \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": container with ID starting with bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.487715 4690 scope.go:117] "RemoveContainer" containerID="126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.488947 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101"} err="failed to get container status \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": rpc error: code = NotFound desc = could not find container \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": container with ID starting with 126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.488986 4690 scope.go:117] "RemoveContainer" containerID="86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.489456 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38"} err="failed to get container status \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": rpc error: code = NotFound desc = could not find container \"86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38\": container with ID starting with 86dc9ef2b8327e0779274f854f03ffac24e452d2c7c4be37e7f9af0daf0d6e38 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.489477 4690 scope.go:117] "RemoveContainer" containerID="edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.490403 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1"} err="failed to get container status \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": rpc error: code = NotFound desc = could not find container \"edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1\": container with ID starting with edcc85af13a00a47cd97cc62d1a8ea4ba5430ccc2575031b7d929018657d3fd1 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.490423 4690 scope.go:117] "RemoveContainer" containerID="bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.490655 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1"} err="failed to get container status \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": rpc error: code = NotFound desc = could not find container \"bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1\": container with ID starting with 
bdf6fbb853034ae1eac8dfde0006b0780f68c09f9498a4a6929c39e37fdab2d1 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.490691 4690 scope.go:117] "RemoveContainer" containerID="126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.490918 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101"} err="failed to get container status \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": rpc error: code = NotFound desc = could not find container \"126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101\": container with ID starting with 126704503de4765a6440e251e1b8bb306826619827125fa5a235eaf63ff4b101 not found: ID does not exist" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.567152 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.593238 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.612767 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.613350 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="sg-core" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613386 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="sg-core" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.613404 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerName="neutron-httpd" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613413 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerName="neutron-httpd" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.613432 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="proxy-httpd" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613442 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="proxy-httpd" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.613468 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerName="neutron-api" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613477 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerName="neutron-api" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.613499 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="ceilometer-central-agent" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613508 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="ceilometer-central-agent" Mar 20 17:54:24 crc kubenswrapper[4690]: E0320 17:54:24.613519 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="ceilometer-notification-agent" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 
17:54:24.613527 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="ceilometer-notification-agent" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613755 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerName="neutron-api" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613820 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ab06c3-11ab-4253-bafa-fea6ac93bedf" containerName="neutron-httpd" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613860 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="ceilometer-central-agent" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613873 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="sg-core" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613891 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="ceilometer-notification-agent" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.613927 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" containerName="proxy-httpd" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.617958 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.620871 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.621023 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.627814 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.694666 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/db231a9f-179b-4f9b-9d02-89e8605dcc24-kube-api-access-nl8vh\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.694769 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.694808 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.694834 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-log-httpd\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 
17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.694862 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-run-httpd\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.694885 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-scripts\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.694935 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-config-data\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.800664 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-config-data\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.800803 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/db231a9f-179b-4f9b-9d02-89e8605dcc24-kube-api-access-nl8vh\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.800880 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.800919 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.800952 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-log-httpd\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.800982 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-run-httpd\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.801006 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-scripts\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " 
pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.803671 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-log-httpd\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.804002 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-run-httpd\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.807168 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-config-data\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.808095 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.808152 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.810161 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-scripts\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.823379 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/db231a9f-179b-4f9b-9d02-89e8605dcc24-kube-api-access-nl8vh\") pod \"ceilometer-0\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.980514 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:24 crc kubenswrapper[4690]: I0320 17:54:24.984213 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.016007 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-tls-certs\") pod \"607d61e7-e52a-46e6-a23a-2d4714c5b543\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.016129 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlrb8\" (UniqueName: \"kubernetes.io/projected/607d61e7-e52a-46e6-a23a-2d4714c5b543-kube-api-access-rlrb8\") pod \"607d61e7-e52a-46e6-a23a-2d4714c5b543\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.016223 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-config-data\") pod \"607d61e7-e52a-46e6-a23a-2d4714c5b543\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.016851 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-combined-ca-bundle\") pod \"607d61e7-e52a-46e6-a23a-2d4714c5b543\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.016934 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/607d61e7-e52a-46e6-a23a-2d4714c5b543-logs\") pod \"607d61e7-e52a-46e6-a23a-2d4714c5b543\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.016962 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-scripts\") pod \"607d61e7-e52a-46e6-a23a-2d4714c5b543\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.017002 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-secret-key\") pod \"607d61e7-e52a-46e6-a23a-2d4714c5b543\" (UID: \"607d61e7-e52a-46e6-a23a-2d4714c5b543\") " Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.017471 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/607d61e7-e52a-46e6-a23a-2d4714c5b543-logs" (OuterVolumeSpecName: "logs") pod "607d61e7-e52a-46e6-a23a-2d4714c5b543" (UID: "607d61e7-e52a-46e6-a23a-2d4714c5b543"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.017893 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/607d61e7-e52a-46e6-a23a-2d4714c5b543-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.019839 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/607d61e7-e52a-46e6-a23a-2d4714c5b543-kube-api-access-rlrb8" (OuterVolumeSpecName: "kube-api-access-rlrb8") pod "607d61e7-e52a-46e6-a23a-2d4714c5b543" (UID: "607d61e7-e52a-46e6-a23a-2d4714c5b543"). 
InnerVolumeSpecName "kube-api-access-rlrb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.021364 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "607d61e7-e52a-46e6-a23a-2d4714c5b543" (UID: "607d61e7-e52a-46e6-a23a-2d4714c5b543"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.040909 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-scripts" (OuterVolumeSpecName: "scripts") pod "607d61e7-e52a-46e6-a23a-2d4714c5b543" (UID: "607d61e7-e52a-46e6-a23a-2d4714c5b543"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.043015 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "607d61e7-e52a-46e6-a23a-2d4714c5b543" (UID: "607d61e7-e52a-46e6-a23a-2d4714c5b543"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.050207 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-config-data" (OuterVolumeSpecName: "config-data") pod "607d61e7-e52a-46e6-a23a-2d4714c5b543" (UID: "607d61e7-e52a-46e6-a23a-2d4714c5b543"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.082756 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "607d61e7-e52a-46e6-a23a-2d4714c5b543" (UID: "607d61e7-e52a-46e6-a23a-2d4714c5b543"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.119539 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.119797 4690 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.119898 4690 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.119989 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlrb8\" (UniqueName: \"kubernetes.io/projected/607d61e7-e52a-46e6-a23a-2d4714c5b543-kube-api-access-rlrb8\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.120068 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/607d61e7-e52a-46e6-a23a-2d4714c5b543-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.120145 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/607d61e7-e52a-46e6-a23a-2d4714c5b543-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.239205 4690 generic.go:334] "Generic (PLEG): container finished" podID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerID="83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d" exitCode=137 Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.239280 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867c5896-qkwmr" event={"ID":"607d61e7-e52a-46e6-a23a-2d4714c5b543","Type":"ContainerDied","Data":"83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d"} Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.239311 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867c5896-qkwmr" event={"ID":"607d61e7-e52a-46e6-a23a-2d4714c5b543","Type":"ContainerDied","Data":"b621d720f873c1bef307556d0e25334ff094c8a37d689fb519e73f28d11cced4"} Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.239331 4690 scope.go:117] "RemoveContainer" containerID="986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.239469 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-867c5896-qkwmr" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.282351 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-867c5896-qkwmr"] Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.289926 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-867c5896-qkwmr"] Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.440634 4690 scope.go:117] "RemoveContainer" containerID="83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.487343 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.489681 4690 scope.go:117] "RemoveContainer" containerID="986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a" Mar 20 17:54:25 crc kubenswrapper[4690]: E0320 17:54:25.490329 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a\": container with ID starting with 986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a not found: ID does not exist" containerID="986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.490426 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a"} err="failed to get container status \"986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a\": rpc error: code = NotFound desc = could not find container \"986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a\": container with ID starting with 986e03253e1d42403b7786f6087a48f5db97b4dbed738848947823b11c19e91a not found: ID does not exist" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.490460 4690 scope.go:117] "RemoveContainer" containerID="83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d" Mar 20 17:54:25 crc kubenswrapper[4690]: E0320 17:54:25.490973 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d\": container with ID starting with 83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d not found: ID does not exist" containerID="83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.491013 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d"} err="failed to get container status \"83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d\": rpc error: code = NotFound desc = could not find container \"83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d\": container with ID starting with 83415bbed66278723c555c9441d97cd81cb450f1f463045bdaece0319a8abe3d not found: ID does not exist" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.895325 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e76b465-2eaf-4a41-9f08-5b397fb08181" path="/var/lib/kubelet/pods/4e76b465-2eaf-4a41-9f08-5b397fb08181/volumes" Mar 20 17:54:25 crc kubenswrapper[4690]: I0320 17:54:25.896581 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" path="/var/lib/kubelet/pods/607d61e7-e52a-46e6-a23a-2d4714c5b543/volumes" Mar 20 17:54:26 crc kubenswrapper[4690]: I0320 17:54:26.276162 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerStarted","Data":"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10"} Mar 20 17:54:26 crc kubenswrapper[4690]: I0320 17:54:26.276441 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerStarted","Data":"67008a7fcff5c905169093de8384e009547fe9372f7543ee0e570f4a2983575b"} Mar 20 17:54:26 crc kubenswrapper[4690]: I0320 17:54:26.798821 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:27 crc kubenswrapper[4690]: I0320 17:54:27.237981 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:54:27 crc kubenswrapper[4690]: I0320 17:54:27.403872 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77ff877fdd-nntbj" Mar 20 17:54:27 crc kubenswrapper[4690]: I0320 17:54:27.512621 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-846cbcbcb-bk7ct"] Mar 20 17:54:27 crc kubenswrapper[4690]: I0320 17:54:27.512844 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-846cbcbcb-bk7ct" podUID="eeac13cf-6875-434a-b276-fae77a828d02" containerName="placement-log" containerID="cri-o://660b50aac9a97ca5f210780af2fbe5551050f7fb5a3827150cecccb01b045837" gracePeriod=30 Mar 20 17:54:27 crc kubenswrapper[4690]: I0320 17:54:27.513333 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-846cbcbcb-bk7ct" podUID="eeac13cf-6875-434a-b276-fae77a828d02" containerName="placement-api" containerID="cri-o://e273e0db8a848a00bb240bb0f1a7fd1fe0162b2b098a6033f1f05d2ac6cb0bba" gracePeriod=30 Mar 20 17:54:28 crc kubenswrapper[4690]: I0320 17:54:28.298084 4690 generic.go:334] "Generic (PLEG): container finished" podID="eeac13cf-6875-434a-b276-fae77a828d02" containerID="660b50aac9a97ca5f210780af2fbe5551050f7fb5a3827150cecccb01b045837" exitCode=143 Mar 20 17:54:28 crc kubenswrapper[4690]: I0320 17:54:28.298161 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-846cbcbcb-bk7ct" event={"ID":"eeac13cf-6875-434a-b276-fae77a828d02","Type":"ContainerDied","Data":"660b50aac9a97ca5f210780af2fbe5551050f7fb5a3827150cecccb01b045837"} Mar 20 17:54:31 crc kubenswrapper[4690]: I0320 17:54:31.354556 4690 generic.go:334] "Generic (PLEG): container finished" podID="eeac13cf-6875-434a-b276-fae77a828d02" containerID="e273e0db8a848a00bb240bb0f1a7fd1fe0162b2b098a6033f1f05d2ac6cb0bba" exitCode=0 Mar 20 17:54:31 crc kubenswrapper[4690]: I0320 17:54:31.354678 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-846cbcbcb-bk7ct" event={"ID":"eeac13cf-6875-434a-b276-fae77a828d02","Type":"ContainerDied","Data":"e273e0db8a848a00bb240bb0f1a7fd1fe0162b2b098a6033f1f05d2ac6cb0bba"} Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.532027 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.667882 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-scripts\") pod \"eeac13cf-6875-434a-b276-fae77a828d02\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.667980 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-combined-ca-bundle\") pod \"eeac13cf-6875-434a-b276-fae77a828d02\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.668034 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-config-data\") pod \"eeac13cf-6875-434a-b276-fae77a828d02\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.668063 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-internal-tls-certs\") pod \"eeac13cf-6875-434a-b276-fae77a828d02\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.668092 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wdsz\" (UniqueName: \"kubernetes.io/projected/eeac13cf-6875-434a-b276-fae77a828d02-kube-api-access-6wdsz\") pod \"eeac13cf-6875-434a-b276-fae77a828d02\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.668181 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-public-tls-certs\") pod \"eeac13cf-6875-434a-b276-fae77a828d02\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.668243 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeac13cf-6875-434a-b276-fae77a828d02-logs\") pod \"eeac13cf-6875-434a-b276-fae77a828d02\" (UID: \"eeac13cf-6875-434a-b276-fae77a828d02\") " Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.669282 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeac13cf-6875-434a-b276-fae77a828d02-logs" (OuterVolumeSpecName: "logs") pod "eeac13cf-6875-434a-b276-fae77a828d02" (UID: "eeac13cf-6875-434a-b276-fae77a828d02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.674673 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-scripts" (OuterVolumeSpecName: "scripts") pod "eeac13cf-6875-434a-b276-fae77a828d02" (UID: "eeac13cf-6875-434a-b276-fae77a828d02"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.677674 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeac13cf-6875-434a-b276-fae77a828d02-kube-api-access-6wdsz" (OuterVolumeSpecName: "kube-api-access-6wdsz") pod "eeac13cf-6875-434a-b276-fae77a828d02" (UID: "eeac13cf-6875-434a-b276-fae77a828d02"). InnerVolumeSpecName "kube-api-access-6wdsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.733525 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eeac13cf-6875-434a-b276-fae77a828d02" (UID: "eeac13cf-6875-434a-b276-fae77a828d02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.752417 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-config-data" (OuterVolumeSpecName: "config-data") pod "eeac13cf-6875-434a-b276-fae77a828d02" (UID: "eeac13cf-6875-434a-b276-fae77a828d02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.770420 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeac13cf-6875-434a-b276-fae77a828d02-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.770469 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.770487 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.770507 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.770525 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wdsz\" (UniqueName: \"kubernetes.io/projected/eeac13cf-6875-434a-b276-fae77a828d02-kube-api-access-6wdsz\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.788885 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eeac13cf-6875-434a-b276-fae77a828d02" (UID: "eeac13cf-6875-434a-b276-fae77a828d02"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.791122 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eeac13cf-6875-434a-b276-fae77a828d02" (UID: "eeac13cf-6875-434a-b276-fae77a828d02"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.871966 4690 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:32 crc kubenswrapper[4690]: I0320 17:54:32.872004 4690 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeac13cf-6875-434a-b276-fae77a828d02-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.377562 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerStarted","Data":"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886"} Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.377871 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerStarted","Data":"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b"} Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.380662 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-846cbcbcb-bk7ct" event={"ID":"eeac13cf-6875-434a-b276-fae77a828d02","Type":"ContainerDied","Data":"7b36f31d6af928bfcd59680b0012319f817b3c646a271c618565edc67ffd8ada"} Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.380707 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-846cbcbcb-bk7ct" Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.380737 4690 scope.go:117] "RemoveContainer" containerID="e273e0db8a848a00bb240bb0f1a7fd1fe0162b2b098a6033f1f05d2ac6cb0bba" Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.383193 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-njclq" event={"ID":"95830c09-d53f-4e08-800d-09d227668aee","Type":"ContainerStarted","Data":"a53656e4c8e345ea3e2b042f3181137b42a535e3c48b19f229cba3b9985e607a"} Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.406218 4690 scope.go:117] "RemoveContainer" containerID="660b50aac9a97ca5f210780af2fbe5551050f7fb5a3827150cecccb01b045837" Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.421913 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-njclq" podStartSLOduration=2.910602688 podStartE2EDuration="12.4218874s" podCreationTimestamp="2026-03-20 17:54:21 +0000 UTC" firstStartedPulling="2026-03-20 17:54:22.784849796 +0000 UTC m=+1337.650675474" lastFinishedPulling="2026-03-20 17:54:32.296134508 +0000 UTC m=+1347.161960186" observedRunningTime="2026-03-20 17:54:33.409933299 +0000 UTC m=+1348.275758977" watchObservedRunningTime="2026-03-20 17:54:33.4218874 +0000 UTC m=+1348.287713078" Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.448612 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-846cbcbcb-bk7ct"] Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.467590 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-846cbcbcb-bk7ct"] Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.590441 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.590937 4690 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerName="glance-httpd" containerID="cri-o://f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3" gracePeriod=30 Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.590770 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerName="glance-log" containerID="cri-o://812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb" gracePeriod=30 Mar 20 17:54:33 crc kubenswrapper[4690]: I0320 17:54:33.893710 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeac13cf-6875-434a-b276-fae77a828d02" path="/var/lib/kubelet/pods/eeac13cf-6875-434a-b276-fae77a828d02/volumes" Mar 20 17:54:34 crc kubenswrapper[4690]: I0320 17:54:34.227121 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:54:34 crc kubenswrapper[4690]: I0320 17:54:34.227423 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerName="glance-log" containerID="cri-o://015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f" gracePeriod=30 Mar 20 17:54:34 crc kubenswrapper[4690]: I0320 17:54:34.227526 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerName="glance-httpd" containerID="cri-o://2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e" gracePeriod=30 Mar 20 17:54:34 crc kubenswrapper[4690]: I0320 17:54:34.396722 4690 generic.go:334] "Generic (PLEG): container finished" podID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerID="015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f" exitCode=143 Mar 20 17:54:34 crc kubenswrapper[4690]: I0320 17:54:34.396842 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"deb0f27d-5620-4c5e-b5b0-a068c76c566f","Type":"ContainerDied","Data":"015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f"} Mar 20 17:54:34 crc kubenswrapper[4690]: I0320 17:54:34.399703 4690 generic.go:334] "Generic (PLEG): container finished" podID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerID="812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb" exitCode=143 Mar 20 17:54:34 crc kubenswrapper[4690]: I0320 17:54:34.399774 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d11f0ffd-e625-4b90-a1e5-2315bf45529d","Type":"ContainerDied","Data":"812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb"} Mar 20 17:54:35 crc kubenswrapper[4690]: I0320 17:54:35.415066 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerStarted","Data":"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf"} Mar 20 17:54:35 crc kubenswrapper[4690]: I0320 17:54:35.415522 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="ceilometer-central-agent" 
containerID="cri-o://25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10" gracePeriod=30 Mar 20 17:54:35 crc kubenswrapper[4690]: I0320 17:54:35.415747 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:54:35 crc kubenswrapper[4690]: I0320 17:54:35.415986 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="proxy-httpd" containerID="cri-o://5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf" gracePeriod=30 Mar 20 17:54:35 crc kubenswrapper[4690]: I0320 17:54:35.416030 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="sg-core" containerID="cri-o://3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886" gracePeriod=30 Mar 20 17:54:35 crc kubenswrapper[4690]: I0320 17:54:35.416059 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="ceilometer-notification-agent" containerID="cri-o://abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b" gracePeriod=30 Mar 20 17:54:35 crc kubenswrapper[4690]: I0320 17:54:35.444557 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.361039082 podStartE2EDuration="11.444541735s" podCreationTimestamp="2026-03-20 17:54:24 +0000 UTC" firstStartedPulling="2026-03-20 17:54:25.513348652 +0000 UTC m=+1340.379174330" lastFinishedPulling="2026-03-20 17:54:34.596851305 +0000 UTC m=+1349.462676983" observedRunningTime="2026-03-20 17:54:35.438104877 +0000 UTC m=+1350.303930555" watchObservedRunningTime="2026-03-20 17:54:35.444541735 +0000 UTC m=+1350.310367413" Mar 20 17:54:35 crc kubenswrapper[4690]: E0320 17:54:35.887952 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb231a9f_179b_4f9b_9d02_89e8605dcc24.slice/crio-abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb231a9f_179b_4f9b_9d02_89e8605dcc24.slice/crio-conmon-25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.199630 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.331547 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/db231a9f-179b-4f9b-9d02-89e8605dcc24-kube-api-access-nl8vh\") pod \"db231a9f-179b-4f9b-9d02-89e8605dcc24\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.331644 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-scripts\") pod \"db231a9f-179b-4f9b-9d02-89e8605dcc24\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.331769 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-config-data\") pod \"db231a9f-179b-4f9b-9d02-89e8605dcc24\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.331814 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-log-httpd\") pod \"db231a9f-179b-4f9b-9d02-89e8605dcc24\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.331862 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-combined-ca-bundle\") pod \"db231a9f-179b-4f9b-9d02-89e8605dcc24\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.331938 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-run-httpd\") pod \"db231a9f-179b-4f9b-9d02-89e8605dcc24\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.332001 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-sg-core-conf-yaml\") pod \"db231a9f-179b-4f9b-9d02-89e8605dcc24\" (UID: \"db231a9f-179b-4f9b-9d02-89e8605dcc24\") " Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.332218 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "db231a9f-179b-4f9b-9d02-89e8605dcc24" (UID: "db231a9f-179b-4f9b-9d02-89e8605dcc24"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.332548 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "db231a9f-179b-4f9b-9d02-89e8605dcc24" (UID: "db231a9f-179b-4f9b-9d02-89e8605dcc24"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.332748 4690 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.332772 4690 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db231a9f-179b-4f9b-9d02-89e8605dcc24-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.336952 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-scripts" (OuterVolumeSpecName: "scripts") pod "db231a9f-179b-4f9b-9d02-89e8605dcc24" (UID: "db231a9f-179b-4f9b-9d02-89e8605dcc24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.337544 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db231a9f-179b-4f9b-9d02-89e8605dcc24-kube-api-access-nl8vh" (OuterVolumeSpecName: "kube-api-access-nl8vh") pod "db231a9f-179b-4f9b-9d02-89e8605dcc24" (UID: "db231a9f-179b-4f9b-9d02-89e8605dcc24"). InnerVolumeSpecName "kube-api-access-nl8vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.359056 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "db231a9f-179b-4f9b-9d02-89e8605dcc24" (UID: "db231a9f-179b-4f9b-9d02-89e8605dcc24"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.415747 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db231a9f-179b-4f9b-9d02-89e8605dcc24" (UID: "db231a9f-179b-4f9b-9d02-89e8605dcc24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426100 4690 generic.go:334] "Generic (PLEG): container finished" podID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerID="5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf" exitCode=0 Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426156 4690 generic.go:334] "Generic (PLEG): container finished" podID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerID="3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886" exitCode=2 Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426164 4690 generic.go:334] "Generic (PLEG): container finished" podID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerID="abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b" exitCode=0 Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426172 4690 generic.go:334] "Generic (PLEG): container finished" podID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerID="25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10" exitCode=0 Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426194 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426194 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerDied","Data":"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf"} Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426251 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerDied","Data":"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886"} Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426351 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerDied","Data":"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b"} Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426352 4690 scope.go:117] "RemoveContainer" containerID="5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426364 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerDied","Data":"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10"} Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.426375 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db231a9f-179b-4f9b-9d02-89e8605dcc24","Type":"ContainerDied","Data":"67008a7fcff5c905169093de8384e009547fe9372f7543ee0e570f4a2983575b"} Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.435373 4690 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.435411 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/db231a9f-179b-4f9b-9d02-89e8605dcc24-kube-api-access-nl8vh\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.435422 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.435431 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.444280 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-config-data" (OuterVolumeSpecName: "config-data") pod "db231a9f-179b-4f9b-9d02-89e8605dcc24" (UID: "db231a9f-179b-4f9b-9d02-89e8605dcc24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.454615 4690 scope.go:117] "RemoveContainer" containerID="3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.484030 4690 scope.go:117] "RemoveContainer" containerID="abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.501673 4690 scope.go:117] "RemoveContainer" containerID="25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.518051 4690 scope.go:117] "RemoveContainer" containerID="5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.518516 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": container with ID starting with 5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf not found: ID does not exist" containerID="5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.518559 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf"} err="failed to get container status \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": rpc error: code = NotFound desc = could not find container \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": container with ID starting with 5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.518588 4690 scope.go:117] "RemoveContainer" containerID="3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.519020 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": container with ID starting with 3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886 not found: ID does not exist" containerID="3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.519044 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886"} err="failed to get container status \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": rpc error: code = NotFound desc = could not find container \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": container with ID starting with 3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886 not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.519060 4690 scope.go:117] "RemoveContainer" containerID="abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.519376 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": container with ID starting with 
abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b not found: ID does not exist" containerID="abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.519432 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b"} err="failed to get container status \"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": rpc error: code = NotFound desc = could not find container \"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": container with ID starting with abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.519460 4690 scope.go:117] "RemoveContainer" containerID="25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.519702 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": container with ID starting with 25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10 not found: ID does not exist" containerID="25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.519724 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10"} err="failed to get container status \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": rpc error: code = NotFound desc = could not find container \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": container with ID starting with 25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10 not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.519738 4690 scope.go:117] "RemoveContainer" containerID="5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.519941 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf"} err="failed to get container status \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": rpc error: code = NotFound desc = could not find container \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": container with ID starting with 5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.519969 4690 scope.go:117] "RemoveContainer" containerID="3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.520200 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886"} err="failed to get container status \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": rpc error: code = NotFound desc = could not find container \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": container with ID starting with 3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886 not found: ID does not exist" Mar 20 
17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.520225 4690 scope.go:117] "RemoveContainer" containerID="abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.520612 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b"} err="failed to get container status \"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": rpc error: code = NotFound desc = could not find container \"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": container with ID starting with abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.520642 4690 scope.go:117] "RemoveContainer" containerID="25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.521122 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10"} err="failed to get container status \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": rpc error: code = NotFound desc = could not find container \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": container with ID starting with 25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10 not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.521145 4690 scope.go:117] "RemoveContainer" containerID="5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.521448 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf"} err="failed to get container status \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": rpc error: code = NotFound desc = could not find container \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": container with ID starting with 5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.521470 4690 scope.go:117] "RemoveContainer" containerID="3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.521757 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886"} err="failed to get container status \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": rpc error: code = NotFound desc = could not find container \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": container with ID starting with 3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886 not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.521774 4690 scope.go:117] "RemoveContainer" containerID="abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.521979 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b"} err="failed to get container status 
\"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": rpc error: code = NotFound desc = could not find container \"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": container with ID starting with abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.521991 4690 scope.go:117] "RemoveContainer" containerID="25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522174 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10"} err="failed to get container status \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": rpc error: code = NotFound desc = could not find container \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": container with ID starting with 25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10 not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522187 4690 scope.go:117] "RemoveContainer" containerID="5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522440 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf"} err="failed to get container status \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": rpc error: code = NotFound desc = could not find container \"5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf\": container with ID starting with 5476dd95a751346cec85829ba1201127ed091b8272ee6e3391780d461af7efcf not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522451 4690 scope.go:117] "RemoveContainer" containerID="3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522626 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886"} err="failed to get container status \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": rpc error: code = NotFound desc = could not find container \"3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886\": container with ID starting with 3685bf0ce05709efe2b6df25a32c7d16fbcc0b5a78db46d4381afb507d3c4886 not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522638 4690 scope.go:117] "RemoveContainer" containerID="abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522800 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b"} err="failed to get container status \"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": rpc error: code = NotFound desc = could not find container \"abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b\": container with ID starting with abbbceb121af1e2c94c64d7da274f62fc052febe0e31a98d476f87c6c109d27b not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522812 4690 scope.go:117] "RemoveContainer" 
containerID="25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.522969 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10"} err="failed to get container status \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": rpc error: code = NotFound desc = could not find container \"25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10\": container with ID starting with 25ff9572e869922a027fc488d3d63bf9c506b0eb1aa334b1a0ac4174f5cadd10 not found: ID does not exist" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.537103 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db231a9f-179b-4f9b-9d02-89e8605dcc24-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.771796 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.778131 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.796920 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.800549 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="proxy-httpd" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.800777 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="proxy-httpd" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.800862 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeac13cf-6875-434a-b276-fae77a828d02" containerName="placement-api" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.800940 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeac13cf-6875-434a-b276-fae77a828d02" containerName="placement-api" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.801481 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeac13cf-6875-434a-b276-fae77a828d02" containerName="placement-log" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.801572 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeac13cf-6875-434a-b276-fae77a828d02" containerName="placement-log" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.801686 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon-log" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.801769 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon-log" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.801855 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="ceilometer-notification-agent" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.801986 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="ceilometer-notification-agent" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.802081 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" 
containerName="horizon" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.802158 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.802230 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="sg-core" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.802336 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="sg-core" Mar 20 17:54:36 crc kubenswrapper[4690]: E0320 17:54:36.802433 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="ceilometer-central-agent" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.802514 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="ceilometer-central-agent" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.802810 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeac13cf-6875-434a-b276-fae77a828d02" containerName="placement-api" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.802894 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="ceilometer-central-agent" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.802987 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon-log" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.803072 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="proxy-httpd" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.803149 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="607d61e7-e52a-46e6-a23a-2d4714c5b543" containerName="horizon" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.803222 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="ceilometer-notification-agent" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.803349 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" containerName="sg-core" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.803430 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeac13cf-6875-434a-b276-fae77a828d02" containerName="placement-log" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.805822 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.811430 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.811509 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.853765 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.944673 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-config-data\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.945131 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75596\" (UniqueName: \"kubernetes.io/projected/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-kube-api-access-75596\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.945315 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-scripts\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.945464 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.945497 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-log-httpd\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.945654 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:36 crc kubenswrapper[4690]: I0320 17:54:36.945702 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-run-httpd\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.048144 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-scripts\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.048222 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.048242 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-log-httpd\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.048296 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.048312 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-run-httpd\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.048350 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-config-data\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.048424 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75596\" (UniqueName: \"kubernetes.io/projected/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-kube-api-access-75596\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.050997 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-run-httpd\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.051322 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-log-httpd\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.053567 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.054140 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-scripts\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.054785 4690 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.057231 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-config-data\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.065559 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75596\" (UniqueName: \"kubernetes.io/projected/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-kube-api-access-75596\") pod \"ceilometer-0\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.187769 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.249539 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.351455 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-config-data\") pod \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.351802 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-public-tls-certs\") pod \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.351855 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-httpd-run\") pod \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.351876 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.351900 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-scripts\") pod \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.351978 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-logs\") pod \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.352044 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-combined-ca-bundle\") pod \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.352150 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n4br\" (UniqueName: \"kubernetes.io/projected/d11f0ffd-e625-4b90-a1e5-2315bf45529d-kube-api-access-6n4br\") pod \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\" (UID: \"d11f0ffd-e625-4b90-a1e5-2315bf45529d\") " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.352229 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d11f0ffd-e625-4b90-a1e5-2315bf45529d" (UID: "d11f0ffd-e625-4b90-a1e5-2315bf45529d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.352346 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-logs" (OuterVolumeSpecName: "logs") pod "d11f0ffd-e625-4b90-a1e5-2315bf45529d" (UID: "d11f0ffd-e625-4b90-a1e5-2315bf45529d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.352722 4690 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.352750 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11f0ffd-e625-4b90-a1e5-2315bf45529d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.356532 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "d11f0ffd-e625-4b90-a1e5-2315bf45529d" (UID: "d11f0ffd-e625-4b90-a1e5-2315bf45529d"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.356568 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11f0ffd-e625-4b90-a1e5-2315bf45529d-kube-api-access-6n4br" (OuterVolumeSpecName: "kube-api-access-6n4br") pod "d11f0ffd-e625-4b90-a1e5-2315bf45529d" (UID: "d11f0ffd-e625-4b90-a1e5-2315bf45529d"). InnerVolumeSpecName "kube-api-access-6n4br". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.368157 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-scripts" (OuterVolumeSpecName: "scripts") pod "d11f0ffd-e625-4b90-a1e5-2315bf45529d" (UID: "d11f0ffd-e625-4b90-a1e5-2315bf45529d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.387018 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d11f0ffd-e625-4b90-a1e5-2315bf45529d" (UID: "d11f0ffd-e625-4b90-a1e5-2315bf45529d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.420781 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d11f0ffd-e625-4b90-a1e5-2315bf45529d" (UID: "d11f0ffd-e625-4b90-a1e5-2315bf45529d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.431316 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-config-data" (OuterVolumeSpecName: "config-data") pod "d11f0ffd-e625-4b90-a1e5-2315bf45529d" (UID: "d11f0ffd-e625-4b90-a1e5-2315bf45529d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.443180 4690 generic.go:334] "Generic (PLEG): container finished" podID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerID="f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3" exitCode=0 Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.443217 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d11f0ffd-e625-4b90-a1e5-2315bf45529d","Type":"ContainerDied","Data":"f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3"} Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.443239 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d11f0ffd-e625-4b90-a1e5-2315bf45529d","Type":"ContainerDied","Data":"ba14332577c0b75ae272556eb11fd1aaaace63c827978692cf05a08e58396ed5"} Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.443266 4690 scope.go:117] "RemoveContainer" containerID="f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.443375 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.455639 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.455791 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.455809 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n4br\" (UniqueName: \"kubernetes.io/projected/d11f0ffd-e625-4b90-a1e5-2315bf45529d-kube-api-access-6n4br\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.455820 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.455831 4690 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d11f0ffd-e625-4b90-a1e5-2315bf45529d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.455864 4690 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.518360 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.524643 4690 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.541977 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.544530 4690 scope.go:117] "RemoveContainer" containerID="812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.555442 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:54:37 crc kubenswrapper[4690]: E0320 17:54:37.556788 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerName="glance-httpd" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.556806 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerName="glance-httpd" Mar 20 17:54:37 crc kubenswrapper[4690]: E0320 17:54:37.556838 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerName="glance-log" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.556845 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerName="glance-log" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.557062 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerName="glance-log" Mar 20 17:54:37 crc 
kubenswrapper[4690]: I0320 17:54:37.557082 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" containerName="glance-httpd" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.564332 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.564678 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.566274 4690 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.572929 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.573342 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.636466 4690 scope.go:117] "RemoveContainer" containerID="f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3" Mar 20 17:54:37 crc kubenswrapper[4690]: E0320 17:54:37.638455 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3\": container with ID starting with f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3 not found: ID does not exist" containerID="f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.638492 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3"} err="failed to get container status \"f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3\": rpc error: code = NotFound desc = could not find container \"f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3\": container with ID starting with f525f7e9a79c32144c72f3cdc1110e4430998bf95419f4035a87d2a359ae37e3 not found: ID does not exist" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.638518 4690 scope.go:117] "RemoveContainer" containerID="812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb" Mar 20 17:54:37 crc kubenswrapper[4690]: E0320 17:54:37.642537 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb\": container with ID starting with 812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb not found: ID does not exist" containerID="812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.642596 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb"} err="failed to get container status \"812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb\": rpc error: code = NotFound desc = could not find container \"812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb\": container with ID starting with 
812ab71b05d73eb8736a5210c10c237da0abea28d91276df114f5435a9ef3edb not found: ID does not exist" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.667277 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.667312 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.667586 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b415aa0-2e76-4f43-8f53-2da695c5b62e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.667627 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.667654 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.667672 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.667738 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h592\" (UniqueName: \"kubernetes.io/projected/9b415aa0-2e76-4f43-8f53-2da695c5b62e-kube-api-access-4h592\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.667777 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b415aa0-2e76-4f43-8f53-2da695c5b62e-logs\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.688027 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.769334 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h592\" (UniqueName: \"kubernetes.io/projected/9b415aa0-2e76-4f43-8f53-2da695c5b62e-kube-api-access-4h592\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.769552 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b415aa0-2e76-4f43-8f53-2da695c5b62e-logs\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.769648 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.769671 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.769752 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b415aa0-2e76-4f43-8f53-2da695c5b62e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.769793 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.769826 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.769882 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.771693 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.783904 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b415aa0-2e76-4f43-8f53-2da695c5b62e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.787615 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b415aa0-2e76-4f43-8f53-2da695c5b62e-logs\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.788951 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.795927 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h592\" (UniqueName: \"kubernetes.io/projected/9b415aa0-2e76-4f43-8f53-2da695c5b62e-kube-api-access-4h592\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.800672 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.806747 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.809374 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b415aa0-2e76-4f43-8f53-2da695c5b62e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.816425 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9b415aa0-2e76-4f43-8f53-2da695c5b62e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.899749 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11f0ffd-e625-4b90-a1e5-2315bf45529d" path="/var/lib/kubelet/pods/d11f0ffd-e625-4b90-a1e5-2315bf45529d/volumes" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.907483 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db231a9f-179b-4f9b-9d02-89e8605dcc24" path="/var/lib/kubelet/pods/db231a9f-179b-4f9b-9d02-89e8605dcc24/volumes" Mar 20 17:54:37 crc kubenswrapper[4690]: I0320 17:54:37.920724 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.054567 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.187908 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.188234 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-config-data\") pod \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.188446 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt7k2\" (UniqueName: \"kubernetes.io/projected/deb0f27d-5620-4c5e-b5b0-a068c76c566f-kube-api-access-vt7k2\") pod \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.188652 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-logs\") pod \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.188889 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-httpd-run\") pod \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.188970 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-combined-ca-bundle\") pod \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.189047 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-scripts\") pod \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.202113 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-internal-tls-certs\") pod \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\" (UID: \"deb0f27d-5620-4c5e-b5b0-a068c76c566f\") " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.189674 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "deb0f27d-5620-4c5e-b5b0-a068c76c566f" (UID: "deb0f27d-5620-4c5e-b5b0-a068c76c566f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.190181 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-logs" (OuterVolumeSpecName: "logs") pod "deb0f27d-5620-4c5e-b5b0-a068c76c566f" (UID: "deb0f27d-5620-4c5e-b5b0-a068c76c566f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.195949 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb0f27d-5620-4c5e-b5b0-a068c76c566f-kube-api-access-vt7k2" (OuterVolumeSpecName: "kube-api-access-vt7k2") pod "deb0f27d-5620-4c5e-b5b0-a068c76c566f" (UID: "deb0f27d-5620-4c5e-b5b0-a068c76c566f"). InnerVolumeSpecName "kube-api-access-vt7k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.200605 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "deb0f27d-5620-4c5e-b5b0-a068c76c566f" (UID: "deb0f27d-5620-4c5e-b5b0-a068c76c566f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.200671 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-scripts" (OuterVolumeSpecName: "scripts") pod "deb0f27d-5620-4c5e-b5b0-a068c76c566f" (UID: "deb0f27d-5620-4c5e-b5b0-a068c76c566f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.207897 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt7k2\" (UniqueName: \"kubernetes.io/projected/deb0f27d-5620-4c5e-b5b0-a068c76c566f-kube-api-access-vt7k2\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.207928 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.207937 4690 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb0f27d-5620-4c5e-b5b0-a068c76c566f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.207947 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.207974 4690 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.234200 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deb0f27d-5620-4c5e-b5b0-a068c76c566f" (UID: "deb0f27d-5620-4c5e-b5b0-a068c76c566f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.246847 4690 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.261218 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "deb0f27d-5620-4c5e-b5b0-a068c76c566f" (UID: "deb0f27d-5620-4c5e-b5b0-a068c76c566f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.311449 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.311487 4690 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.311503 4690 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.337491 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-config-data" (OuterVolumeSpecName: "config-data") pod "deb0f27d-5620-4c5e-b5b0-a068c76c566f" (UID: "deb0f27d-5620-4c5e-b5b0-a068c76c566f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.413198 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb0f27d-5620-4c5e-b5b0-a068c76c566f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.463424 4690 generic.go:334] "Generic (PLEG): container finished" podID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerID="2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e" exitCode=0 Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.463660 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.464023 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"deb0f27d-5620-4c5e-b5b0-a068c76c566f","Type":"ContainerDied","Data":"2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e"} Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.464074 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"deb0f27d-5620-4c5e-b5b0-a068c76c566f","Type":"ContainerDied","Data":"8e4e3c37f96e939e058abff7821c99808dc700877ce98560f64c091b0836e0dd"} Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.464092 4690 scope.go:117] "RemoveContainer" containerID="2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.471134 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerStarted","Data":"c556460e694f56a6261f25c6618f72a3c05698be1e72993d58cd05732ce91b1d"} Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.471174 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerStarted","Data":"0f7f48de9b24b0bb54f125e23ec7d5d0f1b71e7d359f2f305bd1f85e169b1df0"} Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.503564 4690 scope.go:117] "RemoveContainer" containerID="015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.517798 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.523653 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.537377 4690 scope.go:117] "RemoveContainer" containerID="2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e" Mar 20 17:54:38 crc kubenswrapper[4690]: E0320 17:54:38.543673 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e\": container with ID starting with 2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e not found: ID does not exist" containerID="2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.543712 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e"} err="failed to get container status \"2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e\": rpc error: code = NotFound desc = could not find container \"2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e\": container with ID starting with 2033301cfebc62e3d9fc98b727590b0fa89505b126f1251f5a0e64100e88156e not found: ID does not exist" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.543736 4690 scope.go:117] "RemoveContainer" containerID="015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f" Mar 20 17:54:38 crc kubenswrapper[4690]: E0320 17:54:38.545557 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f\": container with ID starting with 015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f not found: ID does not exist" containerID="015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.545583 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f"} err="failed to get container status \"015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f\": rpc error: code = NotFound desc = could not find container \"015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f\": container with ID starting with 015697887f8aa9d888845169592a7b02b23c82171277e5982e44a949633a207f not found: ID does not exist" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.545616 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:54:38 crc kubenswrapper[4690]: E0320 17:54:38.546029 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerName="glance-httpd" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.546044 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerName="glance-httpd" Mar 20 17:54:38 crc kubenswrapper[4690]: E0320 17:54:38.546060 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerName="glance-log" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.546066 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerName="glance-log" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.546230 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerName="glance-httpd" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.546253 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" containerName="glance-log" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.547134 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.552598 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.552779 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.559185 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.616211 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d70983fe-8325-430a-beeb-fa3b8007e70e-logs\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.616304 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zv6\" (UniqueName: \"kubernetes.io/projected/d70983fe-8325-430a-beeb-fa3b8007e70e-kube-api-access-m8zv6\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.616332 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.616370 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.616414 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.616437 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.616452 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d70983fe-8325-430a-beeb-fa3b8007e70e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.616476 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.693505 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.718630 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.718688 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d70983fe-8325-430a-beeb-fa3b8007e70e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.718733 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.718801 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d70983fe-8325-430a-beeb-fa3b8007e70e-logs\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.718865 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zv6\" (UniqueName: \"kubernetes.io/projected/d70983fe-8325-430a-beeb-fa3b8007e70e-kube-api-access-m8zv6\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.718897 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.718940 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.719005 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 
17:54:38.719372 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.719735 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d70983fe-8325-430a-beeb-fa3b8007e70e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.719899 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d70983fe-8325-430a-beeb-fa3b8007e70e-logs\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.728681 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.730826 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.734016 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.739864 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d70983fe-8325-430a-beeb-fa3b8007e70e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.745367 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zv6\" (UniqueName: \"kubernetes.io/projected/d70983fe-8325-430a-beeb-fa3b8007e70e-kube-api-access-m8zv6\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.760394 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"d70983fe-8325-430a-beeb-fa3b8007e70e\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:54:38 crc kubenswrapper[4690]: I0320 17:54:38.909002 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:39 crc kubenswrapper[4690]: I0320 17:54:39.483115 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b415aa0-2e76-4f43-8f53-2da695c5b62e","Type":"ContainerStarted","Data":"3dd0c0aab1949fa915f13e4952ec93b35422264b125eecc1489e15744a2bb65d"} Mar 20 17:54:39 crc kubenswrapper[4690]: I0320 17:54:39.486408 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerStarted","Data":"d794e6ce362b84fb6dffb8e695893205130628f9d1c59173657bc0df80b7c841"} Mar 20 17:54:39 crc kubenswrapper[4690]: I0320 17:54:39.603145 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:54:39 crc kubenswrapper[4690]: I0320 17:54:39.895222 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb0f27d-5620-4c5e-b5b0-a068c76c566f" path="/var/lib/kubelet/pods/deb0f27d-5620-4c5e-b5b0-a068c76c566f/volumes" Mar 20 17:54:40 crc kubenswrapper[4690]: I0320 17:54:40.503404 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b415aa0-2e76-4f43-8f53-2da695c5b62e","Type":"ContainerStarted","Data":"8aa491543d40b4f19b29bba14384544a00a4a93b6dfb82a9f5d745c7e200a709"} Mar 20 17:54:40 crc kubenswrapper[4690]: I0320 17:54:40.503728 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b415aa0-2e76-4f43-8f53-2da695c5b62e","Type":"ContainerStarted","Data":"b579700cef106e6e8d9f91fccf359f305d39bad5288c8c7b314255c8ea50e11e"} Mar 20 17:54:40 crc kubenswrapper[4690]: I0320 17:54:40.505681 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d70983fe-8325-430a-beeb-fa3b8007e70e","Type":"ContainerStarted","Data":"28717f3d91e102cbf1a7a43d177935a0eedf637a370bf687e3a29aa3fdfcb784"} Mar 20 17:54:40 crc kubenswrapper[4690]: I0320 17:54:40.505709 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d70983fe-8325-430a-beeb-fa3b8007e70e","Type":"ContainerStarted","Data":"286ea63251b71af243434841a10f53eb3b4bf7432c9799b5a36bc62b892e1a21"} Mar 20 17:54:40 crc kubenswrapper[4690]: I0320 17:54:40.509165 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerStarted","Data":"97a4238ebdc506ab23bec52efa2c1aa2f53d083dfbddafb50aed61d2facf2760"} Mar 20 17:54:40 crc kubenswrapper[4690]: I0320 17:54:40.550076 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.550053913 podStartE2EDuration="3.550053913s" podCreationTimestamp="2026-03-20 17:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:54:40.522627894 +0000 UTC m=+1355.388453592" watchObservedRunningTime="2026-03-20 17:54:40.550053913 +0000 UTC m=+1355.415879591" Mar 20 17:54:41 crc kubenswrapper[4690]: I0320 17:54:41.520332 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d70983fe-8325-430a-beeb-fa3b8007e70e","Type":"ContainerStarted","Data":"95271f4533714ece4817cc0e36f56c30a6c84a5514c63539ae49ea518600f9a0"} Mar 
20 17:54:41 crc kubenswrapper[4690]: I0320 17:54:41.552018 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.551988845 podStartE2EDuration="3.551988845s" podCreationTimestamp="2026-03-20 17:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:54:41.541485894 +0000 UTC m=+1356.407311572" watchObservedRunningTime="2026-03-20 17:54:41.551988845 +0000 UTC m=+1356.417814523" Mar 20 17:54:42 crc kubenswrapper[4690]: I0320 17:54:42.539069 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerStarted","Data":"b839758cdff51e58df519ad01c333db778c86791eec5d8f477cf69e10886bf2a"} Mar 20 17:54:42 crc kubenswrapper[4690]: I0320 17:54:42.539653 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:54:42 crc kubenswrapper[4690]: I0320 17:54:42.565558 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.270707477 podStartE2EDuration="6.5655406s" podCreationTimestamp="2026-03-20 17:54:36 +0000 UTC" firstStartedPulling="2026-03-20 17:54:37.697980633 +0000 UTC m=+1352.563806311" lastFinishedPulling="2026-03-20 17:54:41.992813756 +0000 UTC m=+1356.858639434" observedRunningTime="2026-03-20 17:54:42.559240745 +0000 UTC m=+1357.425066423" watchObservedRunningTime="2026-03-20 17:54:42.5655406 +0000 UTC m=+1357.431366278" Mar 20 17:54:44 crc kubenswrapper[4690]: I0320 17:54:44.558929 4690 generic.go:334] "Generic (PLEG): container finished" podID="95830c09-d53f-4e08-800d-09d227668aee" containerID="a53656e4c8e345ea3e2b042f3181137b42a535e3c48b19f229cba3b9985e607a" exitCode=0 Mar 20 17:54:44 crc kubenswrapper[4690]: I0320 17:54:44.558989 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-njclq" event={"ID":"95830c09-d53f-4e08-800d-09d227668aee","Type":"ContainerDied","Data":"a53656e4c8e345ea3e2b042f3181137b42a535e3c48b19f229cba3b9985e607a"} Mar 20 17:54:45 crc kubenswrapper[4690]: I0320 17:54:45.979014 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.107847 4690 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod48156974-0ab6-4f24-8d90-c5dcdfbe9f37"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod48156974-0ab6-4f24-8d90-c5dcdfbe9f37] : Timed out while waiting for systemd to remove kubepods-besteffort-pod48156974_0ab6_4f24_8d90_c5dcdfbe9f37.slice" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.159646 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-combined-ca-bundle\") pod \"95830c09-d53f-4e08-800d-09d227668aee\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.159792 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-config-data\") pod \"95830c09-d53f-4e08-800d-09d227668aee\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.159886 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-scripts\") pod \"95830c09-d53f-4e08-800d-09d227668aee\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.159936 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bgh9\" (UniqueName: \"kubernetes.io/projected/95830c09-d53f-4e08-800d-09d227668aee-kube-api-access-5bgh9\") pod \"95830c09-d53f-4e08-800d-09d227668aee\" (UID: \"95830c09-d53f-4e08-800d-09d227668aee\") " Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.169630 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-scripts" (OuterVolumeSpecName: "scripts") pod "95830c09-d53f-4e08-800d-09d227668aee" (UID: "95830c09-d53f-4e08-800d-09d227668aee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.169742 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95830c09-d53f-4e08-800d-09d227668aee-kube-api-access-5bgh9" (OuterVolumeSpecName: "kube-api-access-5bgh9") pod "95830c09-d53f-4e08-800d-09d227668aee" (UID: "95830c09-d53f-4e08-800d-09d227668aee"). InnerVolumeSpecName "kube-api-access-5bgh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.192130 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95830c09-d53f-4e08-800d-09d227668aee" (UID: "95830c09-d53f-4e08-800d-09d227668aee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.194723 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-config-data" (OuterVolumeSpecName: "config-data") pod "95830c09-d53f-4e08-800d-09d227668aee" (UID: "95830c09-d53f-4e08-800d-09d227668aee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.261734 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.261773 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.261787 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bgh9\" (UniqueName: \"kubernetes.io/projected/95830c09-d53f-4e08-800d-09d227668aee-kube-api-access-5bgh9\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.261801 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95830c09-d53f-4e08-800d-09d227668aee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.579718 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-njclq" event={"ID":"95830c09-d53f-4e08-800d-09d227668aee","Type":"ContainerDied","Data":"2c3a58957c495edd7112d9fe343d26b3e7ddd9b78c8667d10f62f9da18730bc3"} Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.579777 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c3a58957c495edd7112d9fe343d26b3e7ddd9b78c8667d10f62f9da18730bc3" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.579793 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-njclq" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.667321 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:46 crc kubenswrapper[4690]: E0320 17:54:46.667698 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95830c09-d53f-4e08-800d-09d227668aee" containerName="nova-cell0-conductor-db-sync" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.667713 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="95830c09-d53f-4e08-800d-09d227668aee" containerName="nova-cell0-conductor-db-sync" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.667935 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="95830c09-d53f-4e08-800d-09d227668aee" containerName="nova-cell0-conductor-db-sync" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.668490 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.671413 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f6gl8" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.672328 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.679993 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.772250 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.772399 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.772435 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfwt7\" (UniqueName: \"kubernetes.io/projected/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-kube-api-access-lfwt7\") pod \"nova-cell0-conductor-0\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.873891 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfwt7\" (UniqueName: \"kubernetes.io/projected/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-kube-api-access-lfwt7\") pod \"nova-cell0-conductor-0\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.874042 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.874094 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.879947 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.881225 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.892556 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfwt7\" (UniqueName: \"kubernetes.io/projected/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-kube-api-access-lfwt7\") pod \"nova-cell0-conductor-0\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:46 crc kubenswrapper[4690]: I0320 17:54:46.983901 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:47 crc kubenswrapper[4690]: I0320 17:54:47.457385 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:47 crc kubenswrapper[4690]: W0320 17:54:47.464114 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod185f4ebf_5a0a_4c5c_9c54_61e9c1c886e3.slice/crio-698003ef232fa375c7253bc2c602ec51295d748c0901606869d92e0afff655d7 WatchSource:0}: Error finding container 698003ef232fa375c7253bc2c602ec51295d748c0901606869d92e0afff655d7: Status 404 returned error can't find the container with id 698003ef232fa375c7253bc2c602ec51295d748c0901606869d92e0afff655d7 Mar 20 17:54:47 crc kubenswrapper[4690]: I0320 17:54:47.590334 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3","Type":"ContainerStarted","Data":"698003ef232fa375c7253bc2c602ec51295d748c0901606869d92e0afff655d7"} Mar 20 17:54:47 crc kubenswrapper[4690]: I0320 17:54:47.922086 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:54:47 crc kubenswrapper[4690]: I0320 17:54:47.922142 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:54:47 crc kubenswrapper[4690]: I0320 17:54:47.966190 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:54:47 crc kubenswrapper[4690]: I0320 17:54:47.974893 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.606728 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3","Type":"ContainerStarted","Data":"b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f"} Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.607985 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.608104 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.608217 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.640022 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.639992986 podStartE2EDuration="2.639992986s" podCreationTimestamp="2026-03-20 17:54:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:54:48.624642521 +0000 UTC m=+1363.490468249" watchObservedRunningTime="2026-03-20 17:54:48.639992986 +0000 UTC m=+1363.505818704" Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.910446 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.910521 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.956003 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:48 crc kubenswrapper[4690]: I0320 17:54:48.967459 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:49 crc kubenswrapper[4690]: I0320 17:54:49.612962 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:49 crc kubenswrapper[4690]: I0320 17:54:49.613012 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:50 crc kubenswrapper[4690]: I0320 17:54:50.090641 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:50 crc kubenswrapper[4690]: I0320 17:54:50.558826 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:54:50 crc kubenswrapper[4690]: I0320 17:54:50.603914 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:54:51 crc kubenswrapper[4690]: I0320 17:54:51.628013 4690 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:54:51 crc kubenswrapper[4690]: I0320 17:54:51.629449 4690 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:54:51 crc kubenswrapper[4690]: I0320 17:54:51.628143 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f" gracePeriod=30 Mar 20 17:54:51 crc kubenswrapper[4690]: I0320 17:54:51.737493 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:51 crc kubenswrapper[4690]: I0320 17:54:51.748476 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.029164 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.029717 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="ceilometer-central-agent" containerID="cri-o://c556460e694f56a6261f25c6618f72a3c05698be1e72993d58cd05732ce91b1d" gracePeriod=30 Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.029791 4690 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="sg-core" containerID="cri-o://97a4238ebdc506ab23bec52efa2c1aa2f53d083dfbddafb50aed61d2facf2760" gracePeriod=30 Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.029807 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="ceilometer-notification-agent" containerID="cri-o://d794e6ce362b84fb6dffb8e695893205130628f9d1c59173657bc0df80b7c841" gracePeriod=30 Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.029809 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="proxy-httpd" containerID="cri-o://b839758cdff51e58df519ad01c333db778c86791eec5d8f477cf69e10886bf2a" gracePeriod=30 Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.036468 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.639186 4690 generic.go:334] "Generic (PLEG): container finished" podID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerID="b839758cdff51e58df519ad01c333db778c86791eec5d8f477cf69e10886bf2a" exitCode=0 Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.639508 4690 generic.go:334] "Generic (PLEG): container finished" podID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerID="97a4238ebdc506ab23bec52efa2c1aa2f53d083dfbddafb50aed61d2facf2760" exitCode=2 Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.639520 4690 generic.go:334] "Generic (PLEG): container finished" podID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerID="c556460e694f56a6261f25c6618f72a3c05698be1e72993d58cd05732ce91b1d" exitCode=0 Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.639279 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerDied","Data":"b839758cdff51e58df519ad01c333db778c86791eec5d8f477cf69e10886bf2a"} Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.639612 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerDied","Data":"97a4238ebdc506ab23bec52efa2c1aa2f53d083dfbddafb50aed61d2facf2760"} Mar 20 17:54:52 crc kubenswrapper[4690]: I0320 17:54:52.639625 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerDied","Data":"c556460e694f56a6261f25c6618f72a3c05698be1e72993d58cd05732ce91b1d"} Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.286610 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.404913 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-combined-ca-bundle\") pod \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.405092 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-config-data\") pod \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.405161 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfwt7\" (UniqueName: \"kubernetes.io/projected/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-kube-api-access-lfwt7\") pod \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\" (UID: \"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3\") " Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.410820 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-kube-api-access-lfwt7" (OuterVolumeSpecName: "kube-api-access-lfwt7") pod "185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" (UID: "185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3"). InnerVolumeSpecName "kube-api-access-lfwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.430607 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" (UID: "185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.452488 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-config-data" (OuterVolumeSpecName: "config-data") pod "185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" (UID: "185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.507113 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.507149 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfwt7\" (UniqueName: \"kubernetes.io/projected/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-kube-api-access-lfwt7\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.507163 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.648981 4690 generic.go:334] "Generic (PLEG): container finished" podID="185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" containerID="b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f" exitCode=0 Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.649031 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3","Type":"ContainerDied","Data":"b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f"} Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.649055 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.649079 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3","Type":"ContainerDied","Data":"698003ef232fa375c7253bc2c602ec51295d748c0901606869d92e0afff655d7"} Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.649098 4690 scope.go:117] "RemoveContainer" containerID="b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.675115 4690 scope.go:117] "RemoveContainer" containerID="b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f" Mar 20 17:54:53 crc kubenswrapper[4690]: E0320 17:54:53.675801 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f\": container with ID starting with b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f not found: ID does not exist" containerID="b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.675852 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f"} err="failed to get container status \"b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f\": rpc error: code = NotFound desc = could not find container \"b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f\": container with ID starting with b7617a3c7e573a25fd1bae4da4965ee9fd1e9cceb51a21f90c46fb776234828f not found: ID does not exist" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.703126 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 
17:54:53.719568 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.729884 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:53 crc kubenswrapper[4690]: E0320 17:54:53.730566 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" containerName="nova-cell0-conductor-conductor" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.730586 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" containerName="nova-cell0-conductor-conductor" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.730738 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" containerName="nova-cell0-conductor-conductor" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.731349 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.733660 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f6gl8" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.734220 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.762988 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.812065 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9df793-f6e2-4d60-a54d-971847c8d3ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.812227 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpn9c\" (UniqueName: \"kubernetes.io/projected/9d9df793-f6e2-4d60-a54d-971847c8d3ea-kube-api-access-dpn9c\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.812277 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9df793-f6e2-4d60-a54d-971847c8d3ea-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.897891 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3" path="/var/lib/kubelet/pods/185f4ebf-5a0a-4c5c-9c54-61e9c1c886e3/volumes" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.914623 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpn9c\" (UniqueName: \"kubernetes.io/projected/9d9df793-f6e2-4d60-a54d-971847c8d3ea-kube-api-access-dpn9c\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.914704 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9df793-f6e2-4d60-a54d-971847c8d3ea-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.914851 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9df793-f6e2-4d60-a54d-971847c8d3ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.920526 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9df793-f6e2-4d60-a54d-971847c8d3ea-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.921131 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9df793-f6e2-4d60-a54d-971847c8d3ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:53 crc kubenswrapper[4690]: I0320 17:54:53.936784 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpn9c\" (UniqueName: \"kubernetes.io/projected/9d9df793-f6e2-4d60-a54d-971847c8d3ea-kube-api-access-dpn9c\") pod \"nova-cell0-conductor-0\" (UID: \"9d9df793-f6e2-4d60-a54d-971847c8d3ea\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.053684 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.275778 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.275840 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.610671 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:54:54 crc kubenswrapper[4690]: W0320 17:54:54.620701 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d9df793_f6e2_4d60_a54d_971847c8d3ea.slice/crio-ea1f15a2cf44d56f3373d1c5e339d089f7aaba5e349dccb13c755654b2464e5e WatchSource:0}: Error finding container ea1f15a2cf44d56f3373d1c5e339d089f7aaba5e349dccb13c755654b2464e5e: Status 404 returned error can't find the container with id ea1f15a2cf44d56f3373d1c5e339d089f7aaba5e349dccb13c755654b2464e5e Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.681649 4690 generic.go:334] "Generic (PLEG): container finished" podID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerID="d794e6ce362b84fb6dffb8e695893205130628f9d1c59173657bc0df80b7c841" exitCode=0 Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.681747 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerDied","Data":"d794e6ce362b84fb6dffb8e695893205130628f9d1c59173657bc0df80b7c841"} Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.700627 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9d9df793-f6e2-4d60-a54d-971847c8d3ea","Type":"ContainerStarted","Data":"ea1f15a2cf44d56f3373d1c5e339d089f7aaba5e349dccb13c755654b2464e5e"} Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.829009 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.933551 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-scripts\") pod \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.933627 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-config-data\") pod \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.933645 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75596\" (UniqueName: \"kubernetes.io/projected/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-kube-api-access-75596\") pod \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.933752 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-log-httpd\") pod \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.933837 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-combined-ca-bundle\") pod \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.933853 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-sg-core-conf-yaml\") pod \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.933880 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-run-httpd\") pod \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\" (UID: \"6433d8a9-75c5-47c1-be4e-2fbf10227f4d\") " Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.935367 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6433d8a9-75c5-47c1-be4e-2fbf10227f4d" (UID: "6433d8a9-75c5-47c1-be4e-2fbf10227f4d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.943355 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-kube-api-access-75596" (OuterVolumeSpecName: "kube-api-access-75596") pod "6433d8a9-75c5-47c1-be4e-2fbf10227f4d" (UID: "6433d8a9-75c5-47c1-be4e-2fbf10227f4d"). InnerVolumeSpecName "kube-api-access-75596". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.946005 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6433d8a9-75c5-47c1-be4e-2fbf10227f4d" (UID: "6433d8a9-75c5-47c1-be4e-2fbf10227f4d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.946982 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-scripts" (OuterVolumeSpecName: "scripts") pod "6433d8a9-75c5-47c1-be4e-2fbf10227f4d" (UID: "6433d8a9-75c5-47c1-be4e-2fbf10227f4d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:54 crc kubenswrapper[4690]: I0320 17:54:54.959890 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6433d8a9-75c5-47c1-be4e-2fbf10227f4d" (UID: "6433d8a9-75c5-47c1-be4e-2fbf10227f4d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.008171 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6433d8a9-75c5-47c1-be4e-2fbf10227f4d" (UID: "6433d8a9-75c5-47c1-be4e-2fbf10227f4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.022037 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-config-data" (OuterVolumeSpecName: "config-data") pod "6433d8a9-75c5-47c1-be4e-2fbf10227f4d" (UID: "6433d8a9-75c5-47c1-be4e-2fbf10227f4d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.035542 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.035974 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75596\" (UniqueName: \"kubernetes.io/projected/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-kube-api-access-75596\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.036053 4690 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.036066 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.036078 4690 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.036088 4690 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.036100 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6433d8a9-75c5-47c1-be4e-2fbf10227f4d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.709867 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9d9df793-f6e2-4d60-a54d-971847c8d3ea","Type":"ContainerStarted","Data":"05c242972c149653e4329e99b9ac6243dc60f08c3534e988bb026dffe50bcdc2"} Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.710287 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.712468 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6433d8a9-75c5-47c1-be4e-2fbf10227f4d","Type":"ContainerDied","Data":"0f7f48de9b24b0bb54f125e23ec7d5d0f1b71e7d359f2f305bd1f85e169b1df0"} Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.712506 4690 scope.go:117] "RemoveContainer" containerID="b839758cdff51e58df519ad01c333db778c86791eec5d8f477cf69e10886bf2a" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.712532 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.734950 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.734929109 podStartE2EDuration="2.734929109s" podCreationTimestamp="2026-03-20 17:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:54:55.730905658 +0000 UTC m=+1370.596731356" watchObservedRunningTime="2026-03-20 17:54:55.734929109 +0000 UTC m=+1370.600754787" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.735983 4690 scope.go:117] "RemoveContainer" containerID="97a4238ebdc506ab23bec52efa2c1aa2f53d083dfbddafb50aed61d2facf2760" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.765432 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.769865 4690 scope.go:117] "RemoveContainer" containerID="d794e6ce362b84fb6dffb8e695893205130628f9d1c59173657bc0df80b7c841" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.777411 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.791751 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:55 crc kubenswrapper[4690]: E0320 17:54:55.792195 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="ceilometer-notification-agent" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.792210 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="ceilometer-notification-agent" Mar 20 17:54:55 crc kubenswrapper[4690]: E0320 17:54:55.792221 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="sg-core" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.792227 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="sg-core" Mar 20 17:54:55 crc kubenswrapper[4690]: E0320 17:54:55.792237 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="ceilometer-central-agent" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.792243 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="ceilometer-central-agent" Mar 20 17:54:55 crc kubenswrapper[4690]: E0320 17:54:55.792285 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="proxy-httpd" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.792292 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="proxy-httpd" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.792470 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="sg-core" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.792484 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="proxy-httpd" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.792492 4690 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="ceilometer-notification-agent" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.792508 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" containerName="ceilometer-central-agent" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.794040 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.800315 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.801115 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.801638 4690 scope.go:117] "RemoveContainer" containerID="c556460e694f56a6261f25c6618f72a3c05698be1e72993d58cd05732ce91b1d" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.807937 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.849424 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrxpz\" (UniqueName: \"kubernetes.io/projected/684c0017-320d-4195-a9a0-52a5174dfdd1-kube-api-access-nrxpz\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.849511 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-config-data\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.849768 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-run-httpd\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.849918 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.849959 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-scripts\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.850013 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-log-httpd\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.850029 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.917352 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6433d8a9-75c5-47c1-be4e-2fbf10227f4d" path="/var/lib/kubelet/pods/6433d8a9-75c5-47c1-be4e-2fbf10227f4d/volumes" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.951325 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-run-httpd\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.951375 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.951765 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-scripts\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.951833 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-log-httpd\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.951949 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.951938 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-run-httpd\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.952045 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrxpz\" (UniqueName: \"kubernetes.io/projected/684c0017-320d-4195-a9a0-52a5174dfdd1-kube-api-access-nrxpz\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.952146 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-config-data\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.955635 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-log-httpd\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") 
" pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.957679 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-config-data\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.970348 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.971517 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-scripts\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.976092 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:55 crc kubenswrapper[4690]: I0320 17:54:55.978995 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrxpz\" (UniqueName: \"kubernetes.io/projected/684c0017-320d-4195-a9a0-52a5174dfdd1-kube-api-access-nrxpz\") pod \"ceilometer-0\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " pod="openstack/ceilometer-0" Mar 20 17:54:56 crc kubenswrapper[4690]: I0320 17:54:56.122643 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:54:56 crc kubenswrapper[4690]: I0320 17:54:56.571113 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:54:56 crc kubenswrapper[4690]: I0320 17:54:56.571698 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5855b86f-8504-4956-af4e-0cb3f9ace108" containerName="kube-state-metrics" containerID="cri-o://ac47407de899afac863001490b2003df80d11e3399b69af6add689ae63492528" gracePeriod=30 Mar 20 17:54:56 crc kubenswrapper[4690]: I0320 17:54:56.595799 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:56 crc kubenswrapper[4690]: W0320 17:54:56.612198 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod684c0017_320d_4195_a9a0_52a5174dfdd1.slice/crio-8e81ae9bc42ac5958490b64d6131f4ae58f1a00aa84efba25edfd473f699a7bf WatchSource:0}: Error finding container 8e81ae9bc42ac5958490b64d6131f4ae58f1a00aa84efba25edfd473f699a7bf: Status 404 returned error can't find the container with id 8e81ae9bc42ac5958490b64d6131f4ae58f1a00aa84efba25edfd473f699a7bf Mar 20 17:54:56 crc kubenswrapper[4690]: I0320 17:54:56.721610 4690 generic.go:334] "Generic (PLEG): container finished" podID="5855b86f-8504-4956-af4e-0cb3f9ace108" containerID="ac47407de899afac863001490b2003df80d11e3399b69af6add689ae63492528" exitCode=2 Mar 20 17:54:56 crc kubenswrapper[4690]: I0320 17:54:56.721709 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5855b86f-8504-4956-af4e-0cb3f9ace108","Type":"ContainerDied","Data":"ac47407de899afac863001490b2003df80d11e3399b69af6add689ae63492528"} Mar 20 17:54:56 crc kubenswrapper[4690]: I0320 17:54:56.725161 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerStarted","Data":"8e81ae9bc42ac5958490b64d6131f4ae58f1a00aa84efba25edfd473f699a7bf"} Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.061789 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.176404 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmmsd\" (UniqueName: \"kubernetes.io/projected/5855b86f-8504-4956-af4e-0cb3f9ace108-kube-api-access-vmmsd\") pod \"5855b86f-8504-4956-af4e-0cb3f9ace108\" (UID: \"5855b86f-8504-4956-af4e-0cb3f9ace108\") " Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.183596 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5855b86f-8504-4956-af4e-0cb3f9ace108-kube-api-access-vmmsd" (OuterVolumeSpecName: "kube-api-access-vmmsd") pod "5855b86f-8504-4956-af4e-0cb3f9ace108" (UID: "5855b86f-8504-4956-af4e-0cb3f9ace108"). InnerVolumeSpecName "kube-api-access-vmmsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.280345 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmmsd\" (UniqueName: \"kubernetes.io/projected/5855b86f-8504-4956-af4e-0cb3f9ace108-kube-api-access-vmmsd\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.757970 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerStarted","Data":"2d6cc974d71df78b72e2dde85c3ce0bba672d9729f8c79e23aea5c997d4ed1cc"} Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.759989 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5855b86f-8504-4956-af4e-0cb3f9ace108","Type":"ContainerDied","Data":"87d6401aa5bd45c5c7d4ae253c0af29f9b06720a31cb4e92b14b338f3af8ba94"} Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.760036 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.760045 4690 scope.go:117] "RemoveContainer" containerID="ac47407de899afac863001490b2003df80d11e3399b69af6add689ae63492528" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.811282 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.832546 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.842657 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:54:57 crc kubenswrapper[4690]: E0320 17:54:57.843279 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5855b86f-8504-4956-af4e-0cb3f9ace108" containerName="kube-state-metrics" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.843304 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="5855b86f-8504-4956-af4e-0cb3f9ace108" containerName="kube-state-metrics" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.843533 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="5855b86f-8504-4956-af4e-0cb3f9ace108" containerName="kube-state-metrics" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.844388 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.848153 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.849189 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.852856 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.891804 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6btnn\" (UniqueName: \"kubernetes.io/projected/4d1751ac-6582-4c73-aef9-12952bde5126-kube-api-access-6btnn\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.891875 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.891921 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.891940 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.898080 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5855b86f-8504-4956-af4e-0cb3f9ace108" path="/var/lib/kubelet/pods/5855b86f-8504-4956-af4e-0cb3f9ace108/volumes" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.993404 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.993754 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.993774 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.993922 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6btnn\" (UniqueName: \"kubernetes.io/projected/4d1751ac-6582-4c73-aef9-12952bde5126-kube-api-access-6btnn\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.999142 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.999332 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:57 crc kubenswrapper[4690]: I0320 17:54:57.999567 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d1751ac-6582-4c73-aef9-12952bde5126-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:58 crc kubenswrapper[4690]: I0320 17:54:58.014008 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6btnn\" (UniqueName: \"kubernetes.io/projected/4d1751ac-6582-4c73-aef9-12952bde5126-kube-api-access-6btnn\") pod \"kube-state-metrics-0\" (UID: \"4d1751ac-6582-4c73-aef9-12952bde5126\") " pod="openstack/kube-state-metrics-0" Mar 20 17:54:58 crc kubenswrapper[4690]: I0320 17:54:58.162546 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:54:58 crc kubenswrapper[4690]: I0320 17:54:58.520591 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:54:58 crc kubenswrapper[4690]: I0320 17:54:58.622092 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:54:58 crc kubenswrapper[4690]: W0320 17:54:58.632358 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d1751ac_6582_4c73_aef9_12952bde5126.slice/crio-64dc58505b163a6fa08ca09ac7ea421dbc18d417e1932aaced7e524cf69b0ecd WatchSource:0}: Error finding container 64dc58505b163a6fa08ca09ac7ea421dbc18d417e1932aaced7e524cf69b0ecd: Status 404 returned error can't find the container with id 64dc58505b163a6fa08ca09ac7ea421dbc18d417e1932aaced7e524cf69b0ecd Mar 20 17:54:58 crc kubenswrapper[4690]: I0320 17:54:58.769565 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d1751ac-6582-4c73-aef9-12952bde5126","Type":"ContainerStarted","Data":"64dc58505b163a6fa08ca09ac7ea421dbc18d417e1932aaced7e524cf69b0ecd"} Mar 20 17:54:58 crc kubenswrapper[4690]: I0320 17:54:58.772929 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerStarted","Data":"1169c3782cff65c6859a1024927f6b920024feebecb830c8d188d884432b50c4"} Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.109185 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.785648 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerStarted","Data":"eee733de193fa9a786e74c6633f006599f51ca530be0593065ce97cca0c9e7bd"} Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.787390 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d1751ac-6582-4c73-aef9-12952bde5126","Type":"ContainerStarted","Data":"f6bdb28c33f3e181a4c4fd60e7cef39d2eb07bf2ca3c84365e9b5eb4ad7403eb"} Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.787575 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.813371 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.233181358 podStartE2EDuration="2.813349807s" podCreationTimestamp="2026-03-20 17:54:57 +0000 UTC" firstStartedPulling="2026-03-20 17:54:58.634968298 +0000 UTC m=+1373.500793976" lastFinishedPulling="2026-03-20 17:54:59.215136747 +0000 UTC m=+1374.080962425" observedRunningTime="2026-03-20 17:54:59.8062269 +0000 UTC m=+1374.672052598" watchObservedRunningTime="2026-03-20 17:54:59.813349807 +0000 UTC m=+1374.679175485" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.822554 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-454zt"] Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.824142 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.826334 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.832729 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.836301 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-454zt"] Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.931028 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvgd7\" (UniqueName: \"kubernetes.io/projected/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-kube-api-access-bvgd7\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.931285 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.931341 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-scripts\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:54:59 crc kubenswrapper[4690]: I0320 17:54:59.931409 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-config-data\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.013780 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.015591 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.023307 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.036347 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-config-data\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.036428 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvgd7\" (UniqueName: \"kubernetes.io/projected/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-kube-api-access-bvgd7\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.036530 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.036561 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-scripts\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.049680 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.054284 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.054510 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-scripts\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.078143 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-config-data\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.092843 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvgd7\" (UniqueName: \"kubernetes.io/projected/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-kube-api-access-bvgd7\") pod \"nova-cell0-cell-mapping-454zt\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.141293 4690 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.141621 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqf5n\" (UniqueName: \"kubernetes.io/projected/0767f87b-816c-4824-aaf1-8eb760dc6ee8-kube-api-access-lqf5n\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.141725 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.159334 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.182050 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.187284 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.216462 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.248887 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqf5n\" (UniqueName: \"kubernetes.io/projected/0767f87b-816c-4824-aaf1-8eb760dc6ee8-kube-api-access-lqf5n\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.248958 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.249035 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.249094 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240578b9-6354-4ab4-9e38-cec9daee9be4-logs\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.249160 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-config-data\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.249209 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.249247 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz2st\" (UniqueName: \"kubernetes.io/projected/240578b9-6354-4ab4-9e38-cec9daee9be4-kube-api-access-bz2st\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.253196 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.288810 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.291142 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.309354 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.326199 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.351682 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.351752 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.351787 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240578b9-6354-4ab4-9e38-cec9daee9be4-logs\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.351829 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmxtj\" (UniqueName: \"kubernetes.io/projected/2c7be3e2-e25f-45c8-8320-d1b5407835fe-kube-api-access-hmxtj\") pod \"nova-scheduler-0\" (UID: 
\"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.351850 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-config-data\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.351872 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-config-data\") pod \"nova-scheduler-0\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.351902 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz2st\" (UniqueName: \"kubernetes.io/projected/240578b9-6354-4ab4-9e38-cec9daee9be4-kube-api-access-bz2st\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.352249 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.364617 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240578b9-6354-4ab4-9e38-cec9daee9be4-logs\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.369919 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqf5n\" (UniqueName: \"kubernetes.io/projected/0767f87b-816c-4824-aaf1-8eb760dc6ee8-kube-api-access-lqf5n\") pod \"nova-cell1-novncproxy-0\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.370371 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.370516 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.384588 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-config-data\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.384860 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz2st\" (UniqueName: \"kubernetes.io/projected/240578b9-6354-4ab4-9e38-cec9daee9be4-kube-api-access-bz2st\") pod \"nova-api-0\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.386483 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.407303 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.408700 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.414642 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.458739 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.459051 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.459136 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmxtj\" (UniqueName: \"kubernetes.io/projected/2c7be3e2-e25f-45c8-8320-d1b5407835fe-kube-api-access-hmxtj\") pod \"nova-scheduler-0\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.459173 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-config-data\") pod \"nova-scheduler-0\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.459199 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmmx\" (UniqueName: \"kubernetes.io/projected/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-kube-api-access-hwmmx\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.459233 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-config-data\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.459262 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-logs\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.464483 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-config-data\") pod \"nova-scheduler-0\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: 
I0320 17:55:00.468680 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.479788 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmxtj\" (UniqueName: \"kubernetes.io/projected/2c7be3e2-e25f-45c8-8320-d1b5407835fe-kube-api-access-hmxtj\") pod \"nova-scheduler-0\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.495308 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.520665 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-q25ft"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.522756 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.531409 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-q25ft"] Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.571885 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.571953 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.571984 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmmx\" (UniqueName: \"kubernetes.io/projected/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-kube-api-access-hwmmx\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.572014 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-config-data\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.572029 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-logs\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.572047 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw2bn\" (UniqueName: \"kubernetes.io/projected/c4134682-ffd8-4189-9abd-bf4f23b57a90-kube-api-access-vw2bn\") pod 
\"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.572075 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.572115 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-svc\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.572143 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-config\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.572168 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.576640 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-logs\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.585474 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-config-data\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.586178 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.596592 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmmx\" (UniqueName: \"kubernetes.io/projected/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-kube-api-access-hwmmx\") pod \"nova-metadata-0\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.647150 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.673808 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-svc\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.673879 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-config\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.673980 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.674033 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.674090 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw2bn\" (UniqueName: \"kubernetes.io/projected/c4134682-ffd8-4189-9abd-bf4f23b57a90-kube-api-access-vw2bn\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.674124 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.675131 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.676380 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-svc\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.677005 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc 
kubenswrapper[4690]: I0320 17:55:00.677609 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-config\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.678413 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.704357 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw2bn\" (UniqueName: \"kubernetes.io/projected/c4134682-ffd8-4189-9abd-bf4f23b57a90-kube-api-access-vw2bn\") pod \"dnsmasq-dns-757b4f8459-q25ft\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.716157 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.765312 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.879063 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:00 crc kubenswrapper[4690]: I0320 17:55:00.889916 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-454zt"] Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.069465 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.207490 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-69qfb"] Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.211815 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.218681 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.218955 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.233307 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.276313 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-69qfb"] Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.284741 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6kp9\" (UniqueName: \"kubernetes.io/projected/11cba8a0-5804-4d01-bcdb-ef490500501f-kube-api-access-k6kp9\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.284813 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-scripts\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.284892 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-config-data\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.284913 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.365357 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.387078 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.387218 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-scripts\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.387311 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-config-data\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 
17:55:01.387337 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.387413 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6kp9\" (UniqueName: \"kubernetes.io/projected/11cba8a0-5804-4d01-bcdb-ef490500501f-kube-api-access-k6kp9\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.458595 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-scripts\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.458683 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-config-data\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.461066 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.483769 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6kp9\" (UniqueName: \"kubernetes.io/projected/11cba8a0-5804-4d01-bcdb-ef490500501f-kube-api-access-k6kp9\") pod \"nova-cell1-conductor-db-sync-69qfb\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: W0320 17:55:01.502591 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c7be3e2_e25f_45c8_8320_d1b5407835fe.slice/crio-f77cab5bd3555c5468b7ce27b5aa5c7a200992ac9621790b20853a94672a9ece WatchSource:0}: Error finding container f77cab5bd3555c5468b7ce27b5aa5c7a200992ac9621790b20853a94672a9ece: Status 404 returned error can't find the container with id f77cab5bd3555c5468b7ce27b5aa5c7a200992ac9621790b20853a94672a9ece Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.597184 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-q25ft"] Mar 20 17:55:01 crc kubenswrapper[4690]: W0320 17:55:01.599146 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4134682_ffd8_4189_9abd_bf4f23b57a90.slice/crio-a6bb5d309d753e41fe1a364995cbb0872e1d966766e4cea29d793b2aacf9c3b2 WatchSource:0}: Error finding container a6bb5d309d753e41fe1a364995cbb0872e1d966766e4cea29d793b2aacf9c3b2: Status 404 returned error can't find the container with id 
a6bb5d309d753e41fe1a364995cbb0872e1d966766e4cea29d793b2aacf9c3b2 Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.655487 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.825203 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c7be3e2-e25f-45c8-8320-d1b5407835fe","Type":"ContainerStarted","Data":"f77cab5bd3555c5468b7ce27b5aa5c7a200992ac9621790b20853a94672a9ece"} Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.828597 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-454zt" event={"ID":"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e","Type":"ContainerStarted","Data":"8ba3dabaad1997eae9e7f119e3e94ab9802dfb2f29f3870414a98004cdbf6e7b"} Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.828629 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-454zt" event={"ID":"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e","Type":"ContainerStarted","Data":"d346c17ca1173332dd2f6bde66c3d6c84dfadc2f35963c88caf1aa6956e5e71b"} Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.834309 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa","Type":"ContainerStarted","Data":"ef30f1eeda48128ea0b274fd9f178c38fd77449c66ca2a1caf7f56f648001f23"} Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.839879 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" event={"ID":"c4134682-ffd8-4189-9abd-bf4f23b57a90","Type":"ContainerStarted","Data":"a6bb5d309d753e41fe1a364995cbb0872e1d966766e4cea29d793b2aacf9c3b2"} Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.842191 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"240578b9-6354-4ab4-9e38-cec9daee9be4","Type":"ContainerStarted","Data":"730a2fc9dc1bf8e50094de3b7d7c897619892375dd1cd9a848d3420e7b3315c1"} Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.857834 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="ceilometer-central-agent" containerID="cri-o://2d6cc974d71df78b72e2dde85c3ce0bba672d9729f8c79e23aea5c997d4ed1cc" gracePeriod=30 Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.858005 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="sg-core" containerID="cri-o://eee733de193fa9a786e74c6633f006599f51ca530be0593065ce97cca0c9e7bd" gracePeriod=30 Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.858023 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerStarted","Data":"5551c7fa1f1596229263c54a6b7ac58580bb152dfea8eeb0ff1133f09bc4143b"} Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.858073 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.858076 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="ceilometer-notification-agent" 
containerID="cri-o://1169c3782cff65c6859a1024927f6b920024feebecb830c8d188d884432b50c4" gracePeriod=30 Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.858118 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="proxy-httpd" containerID="cri-o://5551c7fa1f1596229263c54a6b7ac58580bb152dfea8eeb0ff1133f09bc4143b" gracePeriod=30 Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.858461 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-454zt" podStartSLOduration=2.858441485 podStartE2EDuration="2.858441485s" podCreationTimestamp="2026-03-20 17:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:01.851086231 +0000 UTC m=+1376.716911919" watchObservedRunningTime="2026-03-20 17:55:01.858441485 +0000 UTC m=+1376.724267163" Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.873750 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0767f87b-816c-4824-aaf1-8eb760dc6ee8","Type":"ContainerStarted","Data":"626bf09ec0007c08d38c99e459d97e7119bb4e9d09ed6a9de444a1432cbc6c88"} Mar 20 17:55:01 crc kubenswrapper[4690]: I0320 17:55:01.898506 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.532791489 podStartE2EDuration="6.898480504s" podCreationTimestamp="2026-03-20 17:54:55 +0000 UTC" firstStartedPulling="2026-03-20 17:54:56.614793111 +0000 UTC m=+1371.480618799" lastFinishedPulling="2026-03-20 17:55:00.980482136 +0000 UTC m=+1375.846307814" observedRunningTime="2026-03-20 17:55:01.891716986 +0000 UTC m=+1376.757542664" watchObservedRunningTime="2026-03-20 17:55:01.898480504 +0000 UTC m=+1376.764306182" Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.220422 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-69qfb"] Mar 20 17:55:02 crc kubenswrapper[4690]: W0320 17:55:02.264484 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11cba8a0_5804_4d01_bcdb_ef490500501f.slice/crio-ff6f3fbe0a5cce91e4a835de8e6df7fffc96a2e87eaa4bfcfb0ff6a27876ac53 WatchSource:0}: Error finding container ff6f3fbe0a5cce91e4a835de8e6df7fffc96a2e87eaa4bfcfb0ff6a27876ac53: Status 404 returned error can't find the container with id ff6f3fbe0a5cce91e4a835de8e6df7fffc96a2e87eaa4bfcfb0ff6a27876ac53 Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.885795 4690 generic.go:334] "Generic (PLEG): container finished" podID="c4134682-ffd8-4189-9abd-bf4f23b57a90" containerID="5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282" exitCode=0 Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.885870 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" event={"ID":"c4134682-ffd8-4189-9abd-bf4f23b57a90","Type":"ContainerDied","Data":"5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282"} Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.895208 4690 generic.go:334] "Generic (PLEG): container finished" podID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerID="5551c7fa1f1596229263c54a6b7ac58580bb152dfea8eeb0ff1133f09bc4143b" exitCode=0 Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.895238 4690 generic.go:334] 
"Generic (PLEG): container finished" podID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerID="eee733de193fa9a786e74c6633f006599f51ca530be0593065ce97cca0c9e7bd" exitCode=2 Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.895246 4690 generic.go:334] "Generic (PLEG): container finished" podID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerID="1169c3782cff65c6859a1024927f6b920024feebecb830c8d188d884432b50c4" exitCode=0 Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.895302 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerDied","Data":"5551c7fa1f1596229263c54a6b7ac58580bb152dfea8eeb0ff1133f09bc4143b"} Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.895327 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerDied","Data":"eee733de193fa9a786e74c6633f006599f51ca530be0593065ce97cca0c9e7bd"} Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.895337 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerDied","Data":"1169c3782cff65c6859a1024927f6b920024feebecb830c8d188d884432b50c4"} Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.913586 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-69qfb" event={"ID":"11cba8a0-5804-4d01-bcdb-ef490500501f","Type":"ContainerStarted","Data":"d7edac921af3e2d5feb159e51483e142289f9d76a7f9e650bb9c796b33e41066"} Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.913658 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-69qfb" event={"ID":"11cba8a0-5804-4d01-bcdb-ef490500501f","Type":"ContainerStarted","Data":"ff6f3fbe0a5cce91e4a835de8e6df7fffc96a2e87eaa4bfcfb0ff6a27876ac53"} Mar 20 17:55:02 crc kubenswrapper[4690]: I0320 17:55:02.940744 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-69qfb" podStartSLOduration=1.940728392 podStartE2EDuration="1.940728392s" podCreationTimestamp="2026-03-20 17:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:02.929110541 +0000 UTC m=+1377.794936219" watchObservedRunningTime="2026-03-20 17:55:02.940728392 +0000 UTC m=+1377.806554070" Mar 20 17:55:03 crc kubenswrapper[4690]: I0320 17:55:03.702378 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:03 crc kubenswrapper[4690]: I0320 17:55:03.746276 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:04 crc kubenswrapper[4690]: I0320 17:55:04.936868 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" event={"ID":"c4134682-ffd8-4189-9abd-bf4f23b57a90","Type":"ContainerStarted","Data":"b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0"} Mar 20 17:55:04 crc kubenswrapper[4690]: I0320 17:55:04.938768 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:04 crc kubenswrapper[4690]: I0320 17:55:04.965529 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" podStartSLOduration=4.965509537 
podStartE2EDuration="4.965509537s" podCreationTimestamp="2026-03-20 17:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:04.961939798 +0000 UTC m=+1379.827765486" watchObservedRunningTime="2026-03-20 17:55:04.965509537 +0000 UTC m=+1379.831335215" Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.947818 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0767f87b-816c-4824-aaf1-8eb760dc6ee8","Type":"ContainerStarted","Data":"f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8"} Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.948151 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0767f87b-816c-4824-aaf1-8eb760dc6ee8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8" gracePeriod=30 Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.954060 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c7be3e2-e25f-45c8-8320-d1b5407835fe","Type":"ContainerStarted","Data":"0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863"} Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.956558 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa","Type":"ContainerStarted","Data":"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7"} Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.956628 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa","Type":"ContainerStarted","Data":"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c"} Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.956741 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerName="nova-metadata-log" containerID="cri-o://c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c" gracePeriod=30 Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.957098 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerName="nova-metadata-metadata" containerID="cri-o://aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7" gracePeriod=30 Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.967037 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"240578b9-6354-4ab4-9e38-cec9daee9be4","Type":"ContainerStarted","Data":"394dd84b71c6e5d6e0c22eb4b3dc75580c8589a87d9fa00a9e477d9fee50aa8b"} Mar 20 17:55:05 crc kubenswrapper[4690]: I0320 17:55:05.967085 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"240578b9-6354-4ab4-9e38-cec9daee9be4","Type":"ContainerStarted","Data":"877ac42af068efa39ddb6127ea33ddac065609ce0b8559a32b84b38290ac3ba4"} Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.012503 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.654088273 podStartE2EDuration="7.012480097s" podCreationTimestamp="2026-03-20 17:54:59 +0000 UTC" 
firstStartedPulling="2026-03-20 17:55:01.239344186 +0000 UTC m=+1376.105169864" lastFinishedPulling="2026-03-20 17:55:04.59773601 +0000 UTC m=+1379.463561688" observedRunningTime="2026-03-20 17:55:05.980833951 +0000 UTC m=+1380.846659629" watchObservedRunningTime="2026-03-20 17:55:06.012480097 +0000 UTC m=+1380.878305775" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.034023 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.956384057 podStartE2EDuration="6.034003644s" podCreationTimestamp="2026-03-20 17:55:00 +0000 UTC" firstStartedPulling="2026-03-20 17:55:01.518009655 +0000 UTC m=+1376.383835333" lastFinishedPulling="2026-03-20 17:55:04.595629242 +0000 UTC m=+1379.461454920" observedRunningTime="2026-03-20 17:55:06.008080706 +0000 UTC m=+1380.873906394" watchObservedRunningTime="2026-03-20 17:55:06.034003644 +0000 UTC m=+1380.899829312" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.040835 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.956835469 podStartE2EDuration="6.040816042s" podCreationTimestamp="2026-03-20 17:55:00 +0000 UTC" firstStartedPulling="2026-03-20 17:55:01.507509314 +0000 UTC m=+1376.373334992" lastFinishedPulling="2026-03-20 17:55:04.591489857 +0000 UTC m=+1379.457315565" observedRunningTime="2026-03-20 17:55:06.021879658 +0000 UTC m=+1380.887705336" watchObservedRunningTime="2026-03-20 17:55:06.040816042 +0000 UTC m=+1380.906641720" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.048088 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.545217988 podStartE2EDuration="6.048074413s" podCreationTimestamp="2026-03-20 17:55:00 +0000 UTC" firstStartedPulling="2026-03-20 17:55:01.095315807 +0000 UTC m=+1375.961141485" lastFinishedPulling="2026-03-20 17:55:04.598172222 +0000 UTC m=+1379.463997910" observedRunningTime="2026-03-20 17:55:06.044320239 +0000 UTC m=+1380.910145937" watchObservedRunningTime="2026-03-20 17:55:06.048074413 +0000 UTC m=+1380.913900091" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.654537 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.798449 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-logs\") pod \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.798602 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwmmx\" (UniqueName: \"kubernetes.io/projected/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-kube-api-access-hwmmx\") pod \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.798745 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-config-data\") pod \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.798794 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-combined-ca-bundle\") pod \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\" (UID: \"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa\") " Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.798826 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-logs" (OuterVolumeSpecName: "logs") pod "05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" (UID: "05ab337b-5ea6-4657-94e3-cc9f28b0d9fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.799161 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.820241 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-kube-api-access-hwmmx" (OuterVolumeSpecName: "kube-api-access-hwmmx") pod "05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" (UID: "05ab337b-5ea6-4657-94e3-cc9f28b0d9fa"). InnerVolumeSpecName "kube-api-access-hwmmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.836382 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-config-data" (OuterVolumeSpecName: "config-data") pod "05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" (UID: "05ab337b-5ea6-4657-94e3-cc9f28b0d9fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.842239 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" (UID: "05ab337b-5ea6-4657-94e3-cc9f28b0d9fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.901225 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.901278 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.901294 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwmmx\" (UniqueName: \"kubernetes.io/projected/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa-kube-api-access-hwmmx\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.976737 4690 generic.go:334] "Generic (PLEG): container finished" podID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerID="aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7" exitCode=0 Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.976778 4690 generic.go:334] "Generic (PLEG): container finished" podID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerID="c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c" exitCode=143 Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.976801 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.976867 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa","Type":"ContainerDied","Data":"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7"} Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.976962 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa","Type":"ContainerDied","Data":"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c"} Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.977026 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05ab337b-5ea6-4657-94e3-cc9f28b0d9fa","Type":"ContainerDied","Data":"ef30f1eeda48128ea0b274fd9f178c38fd77449c66ca2a1caf7f56f648001f23"} Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.977057 4690 scope.go:117] "RemoveContainer" containerID="aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7" Mar 20 17:55:06 crc kubenswrapper[4690]: I0320 17:55:06.999214 4690 scope.go:117] "RemoveContainer" containerID="c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.018599 4690 scope.go:117] "RemoveContainer" containerID="aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7" Mar 20 17:55:07 crc kubenswrapper[4690]: E0320 17:55:07.019070 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7\": container with ID starting with aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7 not found: ID does not exist" containerID="aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.019115 4690 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7"} err="failed to get container status \"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7\": rpc error: code = NotFound desc = could not find container \"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7\": container with ID starting with aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7 not found: ID does not exist" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.019139 4690 scope.go:117] "RemoveContainer" containerID="c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c" Mar 20 17:55:07 crc kubenswrapper[4690]: E0320 17:55:07.019632 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c\": container with ID starting with c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c not found: ID does not exist" containerID="c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.019832 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c"} err="failed to get container status \"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c\": rpc error: code = NotFound desc = could not find container \"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c\": container with ID starting with c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c not found: ID does not exist" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.019876 4690 scope.go:117] "RemoveContainer" containerID="aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.019983 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.020587 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7"} err="failed to get container status \"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7\": rpc error: code = NotFound desc = could not find container \"aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7\": container with ID starting with aa3401c7df5a8fff1b01805472031b3714f6586ddfbe9a0453c38bb05eb35ce7 not found: ID does not exist" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.020627 4690 scope.go:117] "RemoveContainer" containerID="c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.020950 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c"} err="failed to get container status \"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c\": rpc error: code = NotFound desc = could not find container \"c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c\": container with ID starting with c3cf40d561a11e218246caf700c41b99352c9d08f0eea94dbfcb9f7131a0098c not found: ID does not exist" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.035410 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.045225 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:07 crc kubenswrapper[4690]: E0320 17:55:07.045732 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerName="nova-metadata-metadata" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.045760 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerName="nova-metadata-metadata" Mar 20 17:55:07 crc kubenswrapper[4690]: E0320 17:55:07.045788 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerName="nova-metadata-log" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.045798 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerName="nova-metadata-log" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.046021 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerName="nova-metadata-metadata" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.046040 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" containerName="nova-metadata-log" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.047238 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.050069 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.051970 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.061469 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.205852 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7x6\" (UniqueName: \"kubernetes.io/projected/a5903459-6cc0-495f-b301-4e6cc91b5a5a-kube-api-access-gg7x6\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.205902 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.206039 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.206096 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-config-data\") pod \"nova-metadata-0\" (UID: 
\"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.206114 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5903459-6cc0-495f-b301-4e6cc91b5a5a-logs\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.308157 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-config-data\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.308218 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5903459-6cc0-495f-b301-4e6cc91b5a5a-logs\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.308385 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7x6\" (UniqueName: \"kubernetes.io/projected/a5903459-6cc0-495f-b301-4e6cc91b5a5a-kube-api-access-gg7x6\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.308433 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.308966 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5903459-6cc0-495f-b301-4e6cc91b5a5a-logs\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.309342 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.313245 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.314895 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-config-data\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.323922 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.325347 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7x6\" (UniqueName: \"kubernetes.io/projected/a5903459-6cc0-495f-b301-4e6cc91b5a5a-kube-api-access-gg7x6\") pod \"nova-metadata-0\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.375833 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:07 crc kubenswrapper[4690]: W0320 17:55:07.826064 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5903459_6cc0_495f_b301_4e6cc91b5a5a.slice/crio-0da35885f33d37ea5d576777d17c7337143f20bcbb668d960b17c4b28b79e76f WatchSource:0}: Error finding container 0da35885f33d37ea5d576777d17c7337143f20bcbb668d960b17c4b28b79e76f: Status 404 returned error can't find the container with id 0da35885f33d37ea5d576777d17c7337143f20bcbb668d960b17c4b28b79e76f Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.826454 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.925036 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ab337b-5ea6-4657-94e3-cc9f28b0d9fa" path="/var/lib/kubelet/pods/05ab337b-5ea6-4657-94e3-cc9f28b0d9fa/volumes" Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.993942 4690 generic.go:334] "Generic (PLEG): container finished" podID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerID="2d6cc974d71df78b72e2dde85c3ce0bba672d9729f8c79e23aea5c997d4ed1cc" exitCode=0 Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.994025 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerDied","Data":"2d6cc974d71df78b72e2dde85c3ce0bba672d9729f8c79e23aea5c997d4ed1cc"} Mar 20 17:55:07 crc kubenswrapper[4690]: I0320 17:55:07.995470 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5903459-6cc0-495f-b301-4e6cc91b5a5a","Type":"ContainerStarted","Data":"0da35885f33d37ea5d576777d17c7337143f20bcbb668d960b17c4b28b79e76f"} Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.164419 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.173936 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.333152 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-sg-core-conf-yaml\") pod \"684c0017-320d-4195-a9a0-52a5174dfdd1\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.333614 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-combined-ca-bundle\") pod \"684c0017-320d-4195-a9a0-52a5174dfdd1\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.334035 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-scripts\") pod \"684c0017-320d-4195-a9a0-52a5174dfdd1\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.334059 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-config-data\") pod \"684c0017-320d-4195-a9a0-52a5174dfdd1\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.334142 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-run-httpd\") pod \"684c0017-320d-4195-a9a0-52a5174dfdd1\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.334192 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrxpz\" (UniqueName: \"kubernetes.io/projected/684c0017-320d-4195-a9a0-52a5174dfdd1-kube-api-access-nrxpz\") pod \"684c0017-320d-4195-a9a0-52a5174dfdd1\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.334214 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-log-httpd\") pod \"684c0017-320d-4195-a9a0-52a5174dfdd1\" (UID: \"684c0017-320d-4195-a9a0-52a5174dfdd1\") " Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.335486 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "684c0017-320d-4195-a9a0-52a5174dfdd1" (UID: "684c0017-320d-4195-a9a0-52a5174dfdd1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.335827 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "684c0017-320d-4195-a9a0-52a5174dfdd1" (UID: "684c0017-320d-4195-a9a0-52a5174dfdd1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.338432 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-scripts" (OuterVolumeSpecName: "scripts") pod "684c0017-320d-4195-a9a0-52a5174dfdd1" (UID: "684c0017-320d-4195-a9a0-52a5174dfdd1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.339058 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684c0017-320d-4195-a9a0-52a5174dfdd1-kube-api-access-nrxpz" (OuterVolumeSpecName: "kube-api-access-nrxpz") pod "684c0017-320d-4195-a9a0-52a5174dfdd1" (UID: "684c0017-320d-4195-a9a0-52a5174dfdd1"). InnerVolumeSpecName "kube-api-access-nrxpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.367599 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "684c0017-320d-4195-a9a0-52a5174dfdd1" (UID: "684c0017-320d-4195-a9a0-52a5174dfdd1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.408906 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "684c0017-320d-4195-a9a0-52a5174dfdd1" (UID: "684c0017-320d-4195-a9a0-52a5174dfdd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.436101 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.436132 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.436141 4690 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.436150 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrxpz\" (UniqueName: \"kubernetes.io/projected/684c0017-320d-4195-a9a0-52a5174dfdd1-kube-api-access-nrxpz\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.436161 4690 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/684c0017-320d-4195-a9a0-52a5174dfdd1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.436172 4690 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.443081 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-config-data" (OuterVolumeSpecName: "config-data") pod "684c0017-320d-4195-a9a0-52a5174dfdd1" (UID: "684c0017-320d-4195-a9a0-52a5174dfdd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:08 crc kubenswrapper[4690]: I0320 17:55:08.537559 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684c0017-320d-4195-a9a0-52a5174dfdd1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.012033 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.012068 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"684c0017-320d-4195-a9a0-52a5174dfdd1","Type":"ContainerDied","Data":"8e81ae9bc42ac5958490b64d6131f4ae58f1a00aa84efba25edfd473f699a7bf"} Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.012168 4690 scope.go:117] "RemoveContainer" containerID="5551c7fa1f1596229263c54a6b7ac58580bb152dfea8eeb0ff1133f09bc4143b" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.015939 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5903459-6cc0-495f-b301-4e6cc91b5a5a","Type":"ContainerStarted","Data":"3fec59d75ffc0f45a4b1c25a5e6938133e4589825a4e6b31507cf8a4c126e57e"} Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.015994 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5903459-6cc0-495f-b301-4e6cc91b5a5a","Type":"ContainerStarted","Data":"e924be1dfc1829cb448a315b99d1fe62654ae04178a463e5e7aeb12a768fc68b"} Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.017942 4690 generic.go:334] "Generic (PLEG): container finished" podID="11cba8a0-5804-4d01-bcdb-ef490500501f" containerID="d7edac921af3e2d5feb159e51483e142289f9d76a7f9e650bb9c796b33e41066" exitCode=0 Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.017998 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-69qfb" event={"ID":"11cba8a0-5804-4d01-bcdb-ef490500501f","Type":"ContainerDied","Data":"d7edac921af3e2d5feb159e51483e142289f9d76a7f9e650bb9c796b33e41066"} Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.051860 4690 scope.go:117] "RemoveContainer" containerID="eee733de193fa9a786e74c6633f006599f51ca530be0593065ce97cca0c9e7bd" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.093946 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.09392137 podStartE2EDuration="2.09392137s" podCreationTimestamp="2026-03-20 17:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:09.072845776 +0000 UTC m=+1383.938671464" watchObservedRunningTime="2026-03-20 17:55:09.09392137 +0000 UTC m=+1383.959747068" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.107825 4690 scope.go:117] "RemoveContainer" containerID="1169c3782cff65c6859a1024927f6b920024feebecb830c8d188d884432b50c4" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.122780 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.137048 4690 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.150627 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:09 crc kubenswrapper[4690]: E0320 17:55:09.151065 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="ceilometer-central-agent" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.151086 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="ceilometer-central-agent" Mar 20 17:55:09 crc kubenswrapper[4690]: E0320 17:55:09.151097 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="proxy-httpd" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.151106 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="proxy-httpd" Mar 20 17:55:09 crc kubenswrapper[4690]: E0320 17:55:09.151131 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="ceilometer-notification-agent" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.151140 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="ceilometer-notification-agent" Mar 20 17:55:09 crc kubenswrapper[4690]: E0320 17:55:09.151176 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="sg-core" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.151184 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="sg-core" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.151409 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="proxy-httpd" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.151441 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="ceilometer-notification-agent" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.151456 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="ceilometer-central-agent" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.151472 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" containerName="sg-core" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.162148 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.164875 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.165204 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.175064 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.181634 4690 scope.go:117] "RemoveContainer" containerID="2d6cc974d71df78b72e2dde85c3ce0bba672d9729f8c79e23aea5c997d4ed1cc" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.190371 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.262536 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-run-httpd\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.262573 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.262606 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.262626 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-config-data\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.262651 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.262708 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-log-httpd\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.262731 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-scripts\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.262808 4690 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k574\" (UniqueName: \"kubernetes.io/projected/db1320b9-1cd4-4756-b81a-c3eed18a7140-kube-api-access-5k574\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.365291 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.365369 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-config-data\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.366186 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.366343 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-log-httpd\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.366470 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-scripts\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.366652 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k574\" (UniqueName: \"kubernetes.io/projected/db1320b9-1cd4-4756-b81a-c3eed18a7140-kube-api-access-5k574\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.366688 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-run-httpd\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.366707 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.367181 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-log-httpd\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 
17:55:09.367206 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-run-httpd\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.371078 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.371372 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.371911 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.372545 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-scripts\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.376101 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-config-data\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.398744 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k574\" (UniqueName: \"kubernetes.io/projected/db1320b9-1cd4-4756-b81a-c3eed18a7140-kube-api-access-5k574\") pod \"ceilometer-0\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.487215 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.912113 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684c0017-320d-4195-a9a0-52a5174dfdd1" path="/var/lib/kubelet/pods/684c0017-320d-4195-a9a0-52a5174dfdd1/volumes" Mar 20 17:55:09 crc kubenswrapper[4690]: W0320 17:55:09.937760 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb1320b9_1cd4_4756_b81a_c3eed18a7140.slice/crio-57ffdfb621377fb59d10ddd552692d4c55803d502d40535b98f945f00685b5c7 WatchSource:0}: Error finding container 57ffdfb621377fb59d10ddd552692d4c55803d502d40535b98f945f00685b5c7: Status 404 returned error can't find the container with id 57ffdfb621377fb59d10ddd552692d4c55803d502d40535b98f945f00685b5c7 Mar 20 17:55:09 crc kubenswrapper[4690]: I0320 17:55:09.943639 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.029115 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerStarted","Data":"57ffdfb621377fb59d10ddd552692d4c55803d502d40535b98f945f00685b5c7"} Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.030789 4690 generic.go:334] "Generic (PLEG): container finished" podID="d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" containerID="8ba3dabaad1997eae9e7f119e3e94ab9802dfb2f29f3870414a98004cdbf6e7b" exitCode=0 Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.030953 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-454zt" event={"ID":"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e","Type":"ContainerDied","Data":"8ba3dabaad1997eae9e7f119e3e94ab9802dfb2f29f3870414a98004cdbf6e7b"} Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.387632 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.387938 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.547947 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.649379 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.689986 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-config-data\") pod \"11cba8a0-5804-4d01-bcdb-ef490500501f\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.690161 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-combined-ca-bundle\") pod \"11cba8a0-5804-4d01-bcdb-ef490500501f\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.690210 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6kp9\" (UniqueName: \"kubernetes.io/projected/11cba8a0-5804-4d01-bcdb-ef490500501f-kube-api-access-k6kp9\") pod \"11cba8a0-5804-4d01-bcdb-ef490500501f\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.690508 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-scripts\") pod \"11cba8a0-5804-4d01-bcdb-ef490500501f\" (UID: \"11cba8a0-5804-4d01-bcdb-ef490500501f\") " Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.694346 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-scripts" (OuterVolumeSpecName: "scripts") pod "11cba8a0-5804-4d01-bcdb-ef490500501f" (UID: "11cba8a0-5804-4d01-bcdb-ef490500501f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.697200 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cba8a0-5804-4d01-bcdb-ef490500501f-kube-api-access-k6kp9" (OuterVolumeSpecName: "kube-api-access-k6kp9") pod "11cba8a0-5804-4d01-bcdb-ef490500501f" (UID: "11cba8a0-5804-4d01-bcdb-ef490500501f"). InnerVolumeSpecName "kube-api-access-k6kp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.717456 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.717490 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.732363 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11cba8a0-5804-4d01-bcdb-ef490500501f" (UID: "11cba8a0-5804-4d01-bcdb-ef490500501f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.738892 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-config-data" (OuterVolumeSpecName: "config-data") pod "11cba8a0-5804-4d01-bcdb-ef490500501f" (UID: "11cba8a0-5804-4d01-bcdb-ef490500501f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.755088 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.793484 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.793522 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.793533 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6kp9\" (UniqueName: \"kubernetes.io/projected/11cba8a0-5804-4d01-bcdb-ef490500501f-kube-api-access-k6kp9\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.793542 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cba8a0-5804-4d01-bcdb-ef490500501f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.880476 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.957422 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m9hgh"] Mar 20 17:55:10 crc kubenswrapper[4690]: I0320 17:55:10.957651 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" podUID="14ff8ae3-f423-405a-bdef-0805c9925ba5" containerName="dnsmasq-dns" containerID="cri-o://ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb" gracePeriod=10 Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.112497 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-69qfb" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.112624 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-69qfb" event={"ID":"11cba8a0-5804-4d01-bcdb-ef490500501f","Type":"ContainerDied","Data":"ff6f3fbe0a5cce91e4a835de8e6df7fffc96a2e87eaa4bfcfb0ff6a27876ac53"} Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.112664 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff6f3fbe0a5cce91e4a835de8e6df7fffc96a2e87eaa4bfcfb0ff6a27876ac53" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.152527 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerStarted","Data":"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4"} Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.219218 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:55:11 crc kubenswrapper[4690]: E0320 17:55:11.219738 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cba8a0-5804-4d01-bcdb-ef490500501f" containerName="nova-cell1-conductor-db-sync" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.219763 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cba8a0-5804-4d01-bcdb-ef490500501f" containerName="nova-cell1-conductor-db-sync" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.219997 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cba8a0-5804-4d01-bcdb-ef490500501f" containerName="nova-cell1-conductor-db-sync" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.224127 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.231334 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.250903 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.255933 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.326540 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rsrr\" (UniqueName: \"kubernetes.io/projected/b1aa290f-4335-4859-83e2-b2283b49e235-kube-api-access-8rsrr\") pod \"nova-cell1-conductor-0\" (UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.326681 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1aa290f-4335-4859-83e2-b2283b49e235-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.326742 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1aa290f-4335-4859-83e2-b2283b49e235-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.429341 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rsrr\" (UniqueName: \"kubernetes.io/projected/b1aa290f-4335-4859-83e2-b2283b49e235-kube-api-access-8rsrr\") pod \"nova-cell1-conductor-0\" (UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.429736 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1aa290f-4335-4859-83e2-b2283b49e235-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.429808 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1aa290f-4335-4859-83e2-b2283b49e235-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.445094 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1aa290f-4335-4859-83e2-b2283b49e235-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.448039 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1aa290f-4335-4859-83e2-b2283b49e235-config-data\") pod \"nova-cell1-conductor-0\" 
(UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.475423 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.475459 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.495055 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rsrr\" (UniqueName: \"kubernetes.io/projected/b1aa290f-4335-4859-83e2-b2283b49e235-kube-api-access-8rsrr\") pod \"nova-cell1-conductor-0\" (UID: \"b1aa290f-4335-4859-83e2-b2283b49e235\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.585934 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.745880 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.805021 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.839635 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvgd7\" (UniqueName: \"kubernetes.io/projected/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-kube-api-access-bvgd7\") pod \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.839678 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-nb\") pod \"14ff8ae3-f423-405a-bdef-0805c9925ba5\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.839717 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-combined-ca-bundle\") pod \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.839793 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-swift-storage-0\") pod \"14ff8ae3-f423-405a-bdef-0805c9925ba5\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.839854 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-config-data\") pod \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " Mar 20 
17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.839902 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-svc\") pod \"14ff8ae3-f423-405a-bdef-0805c9925ba5\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.839987 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmp4\" (UniqueName: \"kubernetes.io/projected/14ff8ae3-f423-405a-bdef-0805c9925ba5-kube-api-access-hhmp4\") pod \"14ff8ae3-f423-405a-bdef-0805c9925ba5\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.840009 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-config\") pod \"14ff8ae3-f423-405a-bdef-0805c9925ba5\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.840033 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-sb\") pod \"14ff8ae3-f423-405a-bdef-0805c9925ba5\" (UID: \"14ff8ae3-f423-405a-bdef-0805c9925ba5\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.840079 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-scripts\") pod \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\" (UID: \"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e\") " Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.844473 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ff8ae3-f423-405a-bdef-0805c9925ba5-kube-api-access-hhmp4" (OuterVolumeSpecName: "kube-api-access-hhmp4") pod "14ff8ae3-f423-405a-bdef-0805c9925ba5" (UID: "14ff8ae3-f423-405a-bdef-0805c9925ba5"). InnerVolumeSpecName "kube-api-access-hhmp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.846330 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-scripts" (OuterVolumeSpecName: "scripts") pod "d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" (UID: "d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.871646 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-kube-api-access-bvgd7" (OuterVolumeSpecName: "kube-api-access-bvgd7") pod "d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" (UID: "d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e"). InnerVolumeSpecName "kube-api-access-bvgd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.900387 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-config-data" (OuterVolumeSpecName: "config-data") pod "d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" (UID: "d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.927147 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "14ff8ae3-f423-405a-bdef-0805c9925ba5" (UID: "14ff8ae3-f423-405a-bdef-0805c9925ba5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.935445 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" (UID: "d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.940232 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14ff8ae3-f423-405a-bdef-0805c9925ba5" (UID: "14ff8ae3-f423-405a-bdef-0805c9925ba5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.945901 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhmp4\" (UniqueName: \"kubernetes.io/projected/14ff8ae3-f423-405a-bdef-0805c9925ba5-kube-api-access-hhmp4\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.945927 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.945937 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvgd7\" (UniqueName: \"kubernetes.io/projected/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-kube-api-access-bvgd7\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.945946 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.945955 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.945963 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.945972 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:11 crc kubenswrapper[4690]: I0320 17:55:11.963445 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "14ff8ae3-f423-405a-bdef-0805c9925ba5" (UID: 
"14ff8ae3-f423-405a-bdef-0805c9925ba5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.003797 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "14ff8ae3-f423-405a-bdef-0805c9925ba5" (UID: "14ff8ae3-f423-405a-bdef-0805c9925ba5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.036787 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-config" (OuterVolumeSpecName: "config") pod "14ff8ae3-f423-405a-bdef-0805c9925ba5" (UID: "14ff8ae3-f423-405a-bdef-0805c9925ba5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.049419 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.049466 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.049480 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14ff8ae3-f423-405a-bdef-0805c9925ba5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.170456 4690 generic.go:334] "Generic (PLEG): container finished" podID="14ff8ae3-f423-405a-bdef-0805c9925ba5" containerID="ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb" exitCode=0 Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.170557 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" event={"ID":"14ff8ae3-f423-405a-bdef-0805c9925ba5","Type":"ContainerDied","Data":"ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb"} Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.170593 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" event={"ID":"14ff8ae3-f423-405a-bdef-0805c9925ba5","Type":"ContainerDied","Data":"d071b197fc3af1e53517392c47fd9d9b11fa1a68db14a7bee4f36dc80346cf65"} Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.170614 4690 scope.go:117] "RemoveContainer" containerID="ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.170825 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-m9hgh" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.182709 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-454zt" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.183294 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-454zt" event={"ID":"d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e","Type":"ContainerDied","Data":"d346c17ca1173332dd2f6bde66c3d6c84dfadc2f35963c88caf1aa6956e5e71b"} Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.183341 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d346c17ca1173332dd2f6bde66c3d6c84dfadc2f35963c88caf1aa6956e5e71b" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.224634 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.300601 4690 scope.go:117] "RemoveContainer" containerID="dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.319480 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m9hgh"] Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.334240 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m9hgh"] Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.361971 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.362322 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-log" containerID="cri-o://877ac42af068efa39ddb6127ea33ddac065609ce0b8559a32b84b38290ac3ba4" gracePeriod=30 Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.362901 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-api" containerID="cri-o://394dd84b71c6e5d6e0c22eb4b3dc75580c8589a87d9fa00a9e477d9fee50aa8b" gracePeriod=30 Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.377442 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.377673 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerName="nova-metadata-log" containerID="cri-o://e924be1dfc1829cb448a315b99d1fe62654ae04178a463e5e7aeb12a768fc68b" gracePeriod=30 Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.378087 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerName="nova-metadata-metadata" containerID="cri-o://3fec59d75ffc0f45a4b1c25a5e6938133e4589825a4e6b31507cf8a4c126e57e" gracePeriod=30 Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.383630 4690 scope.go:117] "RemoveContainer" containerID="ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb" Mar 20 17:55:12 crc kubenswrapper[4690]: E0320 17:55:12.384079 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb\": container with ID starting with ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb not found: ID does not exist" 
containerID="ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.384103 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb"} err="failed to get container status \"ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb\": rpc error: code = NotFound desc = could not find container \"ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb\": container with ID starting with ac3419b66515324fa8c3078ab0d570c15a8e78206b2f6c405aee336be32614cb not found: ID does not exist" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.384122 4690 scope.go:117] "RemoveContainer" containerID="dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.385054 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:12 crc kubenswrapper[4690]: E0320 17:55:12.386544 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749\": container with ID starting with dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749 not found: ID does not exist" containerID="dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749" Mar 20 17:55:12 crc kubenswrapper[4690]: I0320 17:55:12.386566 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749"} err="failed to get container status \"dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749\": rpc error: code = NotFound desc = could not find container \"dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749\": container with ID starting with dfbcd3140b95eaa4f33f7554d49b3fa306a7665d254d3b428545c604bc069749 not found: ID does not exist" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.204866 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerStarted","Data":"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1"} Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.213620 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b1aa290f-4335-4859-83e2-b2283b49e235","Type":"ContainerStarted","Data":"4849a9cb484480ffedea9f628fc891579c72a0f6ccbc7f6d807b47608457a900"} Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.213847 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b1aa290f-4335-4859-83e2-b2283b49e235","Type":"ContainerStarted","Data":"339165c9604f61bd516c0d93f04197f550499837c89fed3120008b1c6b20c916"} Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.213963 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.221752 4690 generic.go:334] "Generic (PLEG): container finished" podID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerID="877ac42af068efa39ddb6127ea33ddac065609ce0b8559a32b84b38290ac3ba4" exitCode=143 Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.221816 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"240578b9-6354-4ab4-9e38-cec9daee9be4","Type":"ContainerDied","Data":"877ac42af068efa39ddb6127ea33ddac065609ce0b8559a32b84b38290ac3ba4"} Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.230947 4690 generic.go:334] "Generic (PLEG): container finished" podID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerID="3fec59d75ffc0f45a4b1c25a5e6938133e4589825a4e6b31507cf8a4c126e57e" exitCode=0 Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.230989 4690 generic.go:334] "Generic (PLEG): container finished" podID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerID="e924be1dfc1829cb448a315b99d1fe62654ae04178a463e5e7aeb12a768fc68b" exitCode=143 Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.231068 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5903459-6cc0-495f-b301-4e6cc91b5a5a","Type":"ContainerDied","Data":"3fec59d75ffc0f45a4b1c25a5e6938133e4589825a4e6b31507cf8a4c126e57e"} Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.231107 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5903459-6cc0-495f-b301-4e6cc91b5a5a","Type":"ContainerDied","Data":"e924be1dfc1829cb448a315b99d1fe62654ae04178a463e5e7aeb12a768fc68b"} Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.231183 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2c7be3e2-e25f-45c8-8320-d1b5407835fe" containerName="nova-scheduler-scheduler" containerID="cri-o://0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863" gracePeriod=30 Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.234219 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.234200772 podStartE2EDuration="2.234200772s" podCreationTimestamp="2026-03-20 17:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:13.231238719 +0000 UTC m=+1388.097064398" watchObservedRunningTime="2026-03-20 17:55:13.234200772 +0000 UTC m=+1388.100026460" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.621661 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.692048 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-nova-metadata-tls-certs\") pod \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.692104 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-combined-ca-bundle\") pod \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.692302 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5903459-6cc0-495f-b301-4e6cc91b5a5a-logs\") pod \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.692380 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-config-data\") pod \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.692429 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg7x6\" (UniqueName: \"kubernetes.io/projected/a5903459-6cc0-495f-b301-4e6cc91b5a5a-kube-api-access-gg7x6\") pod \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\" (UID: \"a5903459-6cc0-495f-b301-4e6cc91b5a5a\") " Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.692828 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5903459-6cc0-495f-b301-4e6cc91b5a5a-logs" (OuterVolumeSpecName: "logs") pod "a5903459-6cc0-495f-b301-4e6cc91b5a5a" (UID: "a5903459-6cc0-495f-b301-4e6cc91b5a5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.692965 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5903459-6cc0-495f-b301-4e6cc91b5a5a-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.717934 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5903459-6cc0-495f-b301-4e6cc91b5a5a-kube-api-access-gg7x6" (OuterVolumeSpecName: "kube-api-access-gg7x6") pod "a5903459-6cc0-495f-b301-4e6cc91b5a5a" (UID: "a5903459-6cc0-495f-b301-4e6cc91b5a5a"). InnerVolumeSpecName "kube-api-access-gg7x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.752607 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-config-data" (OuterVolumeSpecName: "config-data") pod "a5903459-6cc0-495f-b301-4e6cc91b5a5a" (UID: "a5903459-6cc0-495f-b301-4e6cc91b5a5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.771363 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5903459-6cc0-495f-b301-4e6cc91b5a5a" (UID: "a5903459-6cc0-495f-b301-4e6cc91b5a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.792739 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a5903459-6cc0-495f-b301-4e6cc91b5a5a" (UID: "a5903459-6cc0-495f-b301-4e6cc91b5a5a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.794822 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.794857 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg7x6\" (UniqueName: \"kubernetes.io/projected/a5903459-6cc0-495f-b301-4e6cc91b5a5a-kube-api-access-gg7x6\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.794873 4690 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.794884 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5903459-6cc0-495f-b301-4e6cc91b5a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:13 crc kubenswrapper[4690]: I0320 17:55:13.893564 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ff8ae3-f423-405a-bdef-0805c9925ba5" path="/var/lib/kubelet/pods/14ff8ae3-f423-405a-bdef-0805c9925ba5/volumes" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.242032 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5903459-6cc0-495f-b301-4e6cc91b5a5a","Type":"ContainerDied","Data":"0da35885f33d37ea5d576777d17c7337143f20bcbb668d960b17c4b28b79e76f"} Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.242045 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.242146 4690 scope.go:117] "RemoveContainer" containerID="3fec59d75ffc0f45a4b1c25a5e6938133e4589825a4e6b31507cf8a4c126e57e" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.248037 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerStarted","Data":"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3"} Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.264750 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.267114 4690 scope.go:117] "RemoveContainer" containerID="e924be1dfc1829cb448a315b99d1fe62654ae04178a463e5e7aeb12a768fc68b" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.296392 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306094 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:14 crc kubenswrapper[4690]: E0320 17:55:14.306517 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ff8ae3-f423-405a-bdef-0805c9925ba5" containerName="dnsmasq-dns" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306534 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ff8ae3-f423-405a-bdef-0805c9925ba5" containerName="dnsmasq-dns" Mar 20 17:55:14 crc kubenswrapper[4690]: E0320 17:55:14.306563 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerName="nova-metadata-log" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306570 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerName="nova-metadata-log" Mar 20 17:55:14 crc kubenswrapper[4690]: E0320 17:55:14.306581 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" containerName="nova-manage" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306587 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" containerName="nova-manage" Mar 20 17:55:14 crc kubenswrapper[4690]: E0320 17:55:14.306609 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerName="nova-metadata-metadata" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306615 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerName="nova-metadata-metadata" Mar 20 17:55:14 crc kubenswrapper[4690]: E0320 17:55:14.306625 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ff8ae3-f423-405a-bdef-0805c9925ba5" containerName="init" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306630 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ff8ae3-f423-405a-bdef-0805c9925ba5" containerName="init" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306787 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ff8ae3-f423-405a-bdef-0805c9925ba5" containerName="dnsmasq-dns" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306809 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerName="nova-metadata-log" Mar 20 17:55:14 crc 
kubenswrapper[4690]: I0320 17:55:14.306824 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" containerName="nova-manage" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.306833 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" containerName="nova-metadata-metadata" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.308167 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.314916 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.316015 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.316433 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.413716 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.413769 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.414084 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-config-data\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.414180 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0953a097-ab34-4b4e-8389-00cc858d9a36-logs\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.414231 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgfpd\" (UniqueName: \"kubernetes.io/projected/0953a097-ab34-4b4e-8389-00cc858d9a36-kube-api-access-hgfpd\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.515666 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-config-data\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.515736 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0953a097-ab34-4b4e-8389-00cc858d9a36-logs\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.515763 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgfpd\" (UniqueName: \"kubernetes.io/projected/0953a097-ab34-4b4e-8389-00cc858d9a36-kube-api-access-hgfpd\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.515790 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.515808 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.516817 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0953a097-ab34-4b4e-8389-00cc858d9a36-logs\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.521530 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-config-data\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.522100 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.522161 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.536948 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgfpd\" (UniqueName: \"kubernetes.io/projected/0953a097-ab34-4b4e-8389-00cc858d9a36-kube-api-access-hgfpd\") pod \"nova-metadata-0\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " pod="openstack/nova-metadata-0" Mar 20 17:55:14 crc kubenswrapper[4690]: I0320 17:55:14.627231 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:55:15 crc kubenswrapper[4690]: I0320 17:55:15.097957 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:15 crc kubenswrapper[4690]: I0320 17:55:15.263618 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0953a097-ab34-4b4e-8389-00cc858d9a36","Type":"ContainerStarted","Data":"98ac9663ea55f6e3f6cdacb736bc37d1d10de45099a043d4690dea03fb1747a4"} Mar 20 17:55:15 crc kubenswrapper[4690]: E0320 17:55:15.719209 4690 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:55:15 crc kubenswrapper[4690]: E0320 17:55:15.721329 4690 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:55:15 crc kubenswrapper[4690]: E0320 17:55:15.722585 4690 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:55:15 crc kubenswrapper[4690]: E0320 17:55:15.722708 4690 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2c7be3e2-e25f-45c8-8320-d1b5407835fe" containerName="nova-scheduler-scheduler" Mar 20 17:55:15 crc kubenswrapper[4690]: I0320 17:55:15.900420 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5903459-6cc0-495f-b301-4e6cc91b5a5a" path="/var/lib/kubelet/pods/a5903459-6cc0-495f-b301-4e6cc91b5a5a/volumes" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.277228 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerStarted","Data":"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502"} Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.277385 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.278596 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0953a097-ab34-4b4e-8389-00cc858d9a36","Type":"ContainerStarted","Data":"7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8"} Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.278626 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0953a097-ab34-4b4e-8389-00cc858d9a36","Type":"ContainerStarted","Data":"cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274"} Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.299825 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=1.7747814480000002 podStartE2EDuration="7.299802236s" podCreationTimestamp="2026-03-20 17:55:09 +0000 UTC" firstStartedPulling="2026-03-20 17:55:09.939900972 +0000 UTC m=+1384.805726650" lastFinishedPulling="2026-03-20 17:55:15.46492172 +0000 UTC m=+1390.330747438" observedRunningTime="2026-03-20 17:55:16.293920423 +0000 UTC m=+1391.159746101" watchObservedRunningTime="2026-03-20 17:55:16.299802236 +0000 UTC m=+1391.165627914" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.315547 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.315528021 podStartE2EDuration="2.315528021s" podCreationTimestamp="2026-03-20 17:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:16.309301309 +0000 UTC m=+1391.175126977" watchObservedRunningTime="2026-03-20 17:55:16.315528021 +0000 UTC m=+1391.181353699" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.722630 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.764143 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-config-data\") pod \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.764186 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-combined-ca-bundle\") pod \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.764296 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmxtj\" (UniqueName: \"kubernetes.io/projected/2c7be3e2-e25f-45c8-8320-d1b5407835fe-kube-api-access-hmxtj\") pod \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\" (UID: \"2c7be3e2-e25f-45c8-8320-d1b5407835fe\") " Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.774306 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7be3e2-e25f-45c8-8320-d1b5407835fe-kube-api-access-hmxtj" (OuterVolumeSpecName: "kube-api-access-hmxtj") pod "2c7be3e2-e25f-45c8-8320-d1b5407835fe" (UID: "2c7be3e2-e25f-45c8-8320-d1b5407835fe"). InnerVolumeSpecName "kube-api-access-hmxtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.803208 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c7be3e2-e25f-45c8-8320-d1b5407835fe" (UID: "2c7be3e2-e25f-45c8-8320-d1b5407835fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.803705 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-config-data" (OuterVolumeSpecName: "config-data") pod "2c7be3e2-e25f-45c8-8320-d1b5407835fe" (UID: "2c7be3e2-e25f-45c8-8320-d1b5407835fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.866493 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.866527 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7be3e2-e25f-45c8-8320-d1b5407835fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:16 crc kubenswrapper[4690]: I0320 17:55:16.866539 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmxtj\" (UniqueName: \"kubernetes.io/projected/2c7be3e2-e25f-45c8-8320-d1b5407835fe-kube-api-access-hmxtj\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.286189 4690 generic.go:334] "Generic (PLEG): container finished" podID="2c7be3e2-e25f-45c8-8320-d1b5407835fe" containerID="0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863" exitCode=0 Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.286437 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.286445 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c7be3e2-e25f-45c8-8320-d1b5407835fe","Type":"ContainerDied","Data":"0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863"} Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.286482 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c7be3e2-e25f-45c8-8320-d1b5407835fe","Type":"ContainerDied","Data":"f77cab5bd3555c5468b7ce27b5aa5c7a200992ac9621790b20853a94672a9ece"} Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.286499 4690 scope.go:117] "RemoveContainer" containerID="0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.324473 4690 scope.go:117] "RemoveContainer" containerID="0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863" Mar 20 17:55:17 crc kubenswrapper[4690]: E0320 17:55:17.326234 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863\": container with ID starting with 0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863 not found: ID does not exist" containerID="0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.326303 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863"} err="failed to get container status \"0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863\": rpc error: code = NotFound desc = could not find container \"0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863\": container with ID starting with 0c982168f6c0db21d3fe2de8aeb6075daec4e3eca159cb524428072ee15a1863 not found: ID does not exist" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.326877 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.343021 4690 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.354183 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:17 crc kubenswrapper[4690]: E0320 17:55:17.354612 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7be3e2-e25f-45c8-8320-d1b5407835fe" containerName="nova-scheduler-scheduler" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.354632 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7be3e2-e25f-45c8-8320-d1b5407835fe" containerName="nova-scheduler-scheduler" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.354802 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c7be3e2-e25f-45c8-8320-d1b5407835fe" containerName="nova-scheduler-scheduler" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.355388 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.358827 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.365183 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.375722 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9lwb\" (UniqueName: \"kubernetes.io/projected/27f99cff-5842-4132-89a9-3cc1872139cf-kube-api-access-l9lwb\") pod \"nova-scheduler-0\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.375779 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-config-data\") pod \"nova-scheduler-0\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.375890 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.478046 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.478218 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9lwb\" (UniqueName: \"kubernetes.io/projected/27f99cff-5842-4132-89a9-3cc1872139cf-kube-api-access-l9lwb\") pod \"nova-scheduler-0\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.478245 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-config-data\") pod \"nova-scheduler-0\" (UID: 
\"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.491093 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.498909 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9lwb\" (UniqueName: \"kubernetes.io/projected/27f99cff-5842-4132-89a9-3cc1872139cf-kube-api-access-l9lwb\") pod \"nova-scheduler-0\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.506041 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-config-data\") pod \"nova-scheduler-0\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.704752 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:55:17 crc kubenswrapper[4690]: I0320 17:55:17.912554 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c7be3e2-e25f-45c8-8320-d1b5407835fe" path="/var/lib/kubelet/pods/2c7be3e2-e25f-45c8-8320-d1b5407835fe/volumes" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.312719 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.315603 4690 generic.go:334] "Generic (PLEG): container finished" podID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerID="394dd84b71c6e5d6e0c22eb4b3dc75580c8589a87d9fa00a9e477d9fee50aa8b" exitCode=0 Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.315709 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"240578b9-6354-4ab4-9e38-cec9daee9be4","Type":"ContainerDied","Data":"394dd84b71c6e5d6e0c22eb4b3dc75580c8589a87d9fa00a9e477d9fee50aa8b"} Mar 20 17:55:18 crc kubenswrapper[4690]: W0320 17:55:18.349401 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27f99cff_5842_4132_89a9_3cc1872139cf.slice/crio-0b8c4448455a714f08b538d5472432e8cc8f85ace5912d1b8835dae54c37ccc0 WatchSource:0}: Error finding container 0b8c4448455a714f08b538d5472432e8cc8f85ace5912d1b8835dae54c37ccc0: Status 404 returned error can't find the container with id 0b8c4448455a714f08b538d5472432e8cc8f85ace5912d1b8835dae54c37ccc0 Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.387803 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.387866 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.530140 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.618638 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240578b9-6354-4ab4-9e38-cec9daee9be4-logs\") pod \"240578b9-6354-4ab4-9e38-cec9daee9be4\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.618793 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz2st\" (UniqueName: \"kubernetes.io/projected/240578b9-6354-4ab4-9e38-cec9daee9be4-kube-api-access-bz2st\") pod \"240578b9-6354-4ab4-9e38-cec9daee9be4\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.618834 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-config-data\") pod \"240578b9-6354-4ab4-9e38-cec9daee9be4\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.618881 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-combined-ca-bundle\") pod \"240578b9-6354-4ab4-9e38-cec9daee9be4\" (UID: \"240578b9-6354-4ab4-9e38-cec9daee9be4\") " Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.620339 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/240578b9-6354-4ab4-9e38-cec9daee9be4-logs" (OuterVolumeSpecName: "logs") pod "240578b9-6354-4ab4-9e38-cec9daee9be4" (UID: "240578b9-6354-4ab4-9e38-cec9daee9be4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.629992 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240578b9-6354-4ab4-9e38-cec9daee9be4-kube-api-access-bz2st" (OuterVolumeSpecName: "kube-api-access-bz2st") pod "240578b9-6354-4ab4-9e38-cec9daee9be4" (UID: "240578b9-6354-4ab4-9e38-cec9daee9be4"). InnerVolumeSpecName "kube-api-access-bz2st". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.647635 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-config-data" (OuterVolumeSpecName: "config-data") pod "240578b9-6354-4ab4-9e38-cec9daee9be4" (UID: "240578b9-6354-4ab4-9e38-cec9daee9be4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.647654 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "240578b9-6354-4ab4-9e38-cec9daee9be4" (UID: "240578b9-6354-4ab4-9e38-cec9daee9be4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.720928 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz2st\" (UniqueName: \"kubernetes.io/projected/240578b9-6354-4ab4-9e38-cec9daee9be4-kube-api-access-bz2st\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.720955 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.720965 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240578b9-6354-4ab4-9e38-cec9daee9be4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:18 crc kubenswrapper[4690]: I0320 17:55:18.720974 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240578b9-6354-4ab4-9e38-cec9daee9be4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.326044 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.327391 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"240578b9-6354-4ab4-9e38-cec9daee9be4","Type":"ContainerDied","Data":"730a2fc9dc1bf8e50094de3b7d7c897619892375dd1cd9a848d3420e7b3315c1"} Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.327430 4690 scope.go:117] "RemoveContainer" containerID="394dd84b71c6e5d6e0c22eb4b3dc75580c8589a87d9fa00a9e477d9fee50aa8b" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.332035 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"27f99cff-5842-4132-89a9-3cc1872139cf","Type":"ContainerStarted","Data":"c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7"} Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.332083 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"27f99cff-5842-4132-89a9-3cc1872139cf","Type":"ContainerStarted","Data":"0b8c4448455a714f08b538d5472432e8cc8f85ace5912d1b8835dae54c37ccc0"} Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.355266 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.355234238 podStartE2EDuration="2.355234238s" podCreationTimestamp="2026-03-20 17:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:19.35099065 +0000 UTC m=+1394.216816338" watchObservedRunningTime="2026-03-20 17:55:19.355234238 +0000 UTC m=+1394.221059916" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.363682 4690 scope.go:117] "RemoveContainer" containerID="877ac42af068efa39ddb6127ea33ddac065609ce0b8559a32b84b38290ac3ba4" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.378451 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.391101 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.401623 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:19 
crc kubenswrapper[4690]: E0320 17:55:19.402203 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-api" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.402299 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-api" Mar 20 17:55:19 crc kubenswrapper[4690]: E0320 17:55:19.402431 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-log" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.402492 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-log" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.402712 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-api" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.402777 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" containerName="nova-api-log" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.403750 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.405327 4690 scope.go:117] "RemoveContainer" containerID="9ad9115247aafa1301ac3af9dec61ff4aef0639cbad4072484683f177f717f15" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.407985 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.412569 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.434021 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/317814bc-68b1-4454-953c-dfacdf66c9da-kube-api-access-g6cwx\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.434458 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/317814bc-68b1-4454-953c-dfacdf66c9da-logs\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.434572 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.434769 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-config-data\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.443061 4690 scope.go:117] "RemoveContainer" containerID="1965a485a3e86fd29b9983142bd03d03793ebad7052e1db882233deb479e4f49" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.483620 4690 
scope.go:117] "RemoveContainer" containerID="0c296493f2d26f37a45c3fc71b5bad3937b0d9c8f5e3e133beffb7249e0856a3" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.536818 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-config-data\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.536939 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/317814bc-68b1-4454-953c-dfacdf66c9da-kube-api-access-g6cwx\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.536970 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/317814bc-68b1-4454-953c-dfacdf66c9da-logs\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.537014 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.537582 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/317814bc-68b1-4454-953c-dfacdf66c9da-logs\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.543279 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.547835 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-config-data\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.553928 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/317814bc-68b1-4454-953c-dfacdf66c9da-kube-api-access-g6cwx\") pod \"nova-api-0\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.720572 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:19 crc kubenswrapper[4690]: I0320 17:55:19.900428 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240578b9-6354-4ab4-9e38-cec9daee9be4" path="/var/lib/kubelet/pods/240578b9-6354-4ab4-9e38-cec9daee9be4/volumes" Mar 20 17:55:20 crc kubenswrapper[4690]: I0320 17:55:20.185170 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:20 crc kubenswrapper[4690]: W0320 17:55:20.187741 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod317814bc_68b1_4454_953c_dfacdf66c9da.slice/crio-82854f3238cfd87823f435c25dc3a68f3036ed6aade5ee9906bd1f4e6ed24707 WatchSource:0}: Error finding container 82854f3238cfd87823f435c25dc3a68f3036ed6aade5ee9906bd1f4e6ed24707: Status 404 returned error can't find the container with id 82854f3238cfd87823f435c25dc3a68f3036ed6aade5ee9906bd1f4e6ed24707 Mar 20 17:55:20 crc kubenswrapper[4690]: I0320 17:55:20.342222 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"317814bc-68b1-4454-953c-dfacdf66c9da","Type":"ContainerStarted","Data":"82854f3238cfd87823f435c25dc3a68f3036ed6aade5ee9906bd1f4e6ed24707"} Mar 20 17:55:21 crc kubenswrapper[4690]: I0320 17:55:21.355170 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"317814bc-68b1-4454-953c-dfacdf66c9da","Type":"ContainerStarted","Data":"b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd"} Mar 20 17:55:21 crc kubenswrapper[4690]: I0320 17:55:21.355211 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"317814bc-68b1-4454-953c-dfacdf66c9da","Type":"ContainerStarted","Data":"96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e"} Mar 20 17:55:21 crc kubenswrapper[4690]: I0320 17:55:21.387839 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.387809908 podStartE2EDuration="2.387809908s" podCreationTimestamp="2026-03-20 17:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:21.38534928 +0000 UTC m=+1396.251174958" watchObservedRunningTime="2026-03-20 17:55:21.387809908 +0000 UTC m=+1396.253635636" Mar 20 17:55:21 crc kubenswrapper[4690]: I0320 17:55:21.638631 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 17:55:22 crc kubenswrapper[4690]: I0320 17:55:22.705594 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:55:24 crc kubenswrapper[4690]: I0320 17:55:24.273588 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:55:24 crc kubenswrapper[4690]: I0320 17:55:24.273953 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:55:24 crc 
kubenswrapper[4690]: I0320 17:55:24.627762 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:55:24 crc kubenswrapper[4690]: I0320 17:55:24.627822 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:55:25 crc kubenswrapper[4690]: I0320 17:55:25.642398 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:55:25 crc kubenswrapper[4690]: I0320 17:55:25.642451 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:55:27 crc kubenswrapper[4690]: I0320 17:55:27.705638 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 17:55:27 crc kubenswrapper[4690]: I0320 17:55:27.739316 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 17:55:28 crc kubenswrapper[4690]: I0320 17:55:28.488168 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 17:55:29 crc kubenswrapper[4690]: I0320 17:55:29.721690 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:55:29 crc kubenswrapper[4690]: I0320 17:55:29.721779 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:55:30 crc kubenswrapper[4690]: I0320 17:55:30.804491 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:55:30 crc kubenswrapper[4690]: I0320 17:55:30.804882 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:55:32 crc kubenswrapper[4690]: I0320 17:55:32.628104 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:55:32 crc kubenswrapper[4690]: I0320 17:55:32.628159 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:55:34 crc kubenswrapper[4690]: I0320 17:55:34.634031 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:55:34 crc kubenswrapper[4690]: I0320 17:55:34.634403 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:55:34 crc kubenswrapper[4690]: I0320 17:55:34.640584 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:55:34 crc kubenswrapper[4690]: I0320 17:55:34.640697 4690 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.441168 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.556305 4690 generic.go:334] "Generic (PLEG): container finished" podID="0767f87b-816c-4824-aaf1-8eb760dc6ee8" containerID="f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8" exitCode=137 Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.556367 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.556375 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0767f87b-816c-4824-aaf1-8eb760dc6ee8","Type":"ContainerDied","Data":"f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8"} Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.556483 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0767f87b-816c-4824-aaf1-8eb760dc6ee8","Type":"ContainerDied","Data":"626bf09ec0007c08d38c99e459d97e7119bb4e9d09ed6a9de444a1432cbc6c88"} Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.556522 4690 scope.go:117] "RemoveContainer" containerID="f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.586080 4690 scope.go:117] "RemoveContainer" containerID="f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8" Mar 20 17:55:36 crc kubenswrapper[4690]: E0320 17:55:36.586647 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8\": container with ID starting with f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8 not found: ID does not exist" containerID="f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.586696 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8"} err="failed to get container status \"f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8\": rpc error: code = NotFound desc = could not find container \"f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8\": container with ID starting with f80e166e696db08579ebb1a6ee42e6b620decd863b0f09b952e0cefa723722a8 not found: ID does not exist" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.623502 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqf5n\" (UniqueName: \"kubernetes.io/projected/0767f87b-816c-4824-aaf1-8eb760dc6ee8-kube-api-access-lqf5n\") pod \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.623645 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-combined-ca-bundle\") pod \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.623880 4690 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-config-data\") pod \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\" (UID: \"0767f87b-816c-4824-aaf1-8eb760dc6ee8\") " Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.631572 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0767f87b-816c-4824-aaf1-8eb760dc6ee8-kube-api-access-lqf5n" (OuterVolumeSpecName: "kube-api-access-lqf5n") pod "0767f87b-816c-4824-aaf1-8eb760dc6ee8" (UID: "0767f87b-816c-4824-aaf1-8eb760dc6ee8"). InnerVolumeSpecName "kube-api-access-lqf5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.660312 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-config-data" (OuterVolumeSpecName: "config-data") pod "0767f87b-816c-4824-aaf1-8eb760dc6ee8" (UID: "0767f87b-816c-4824-aaf1-8eb760dc6ee8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.667650 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0767f87b-816c-4824-aaf1-8eb760dc6ee8" (UID: "0767f87b-816c-4824-aaf1-8eb760dc6ee8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.727244 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqf5n\" (UniqueName: \"kubernetes.io/projected/0767f87b-816c-4824-aaf1-8eb760dc6ee8-kube-api-access-lqf5n\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.727352 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.727371 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0767f87b-816c-4824-aaf1-8eb760dc6ee8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.904619 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.919039 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.927642 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:36 crc kubenswrapper[4690]: E0320 17:55:36.928199 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0767f87b-816c-4824-aaf1-8eb760dc6ee8" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.928220 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="0767f87b-816c-4824-aaf1-8eb760dc6ee8" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.928475 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="0767f87b-816c-4824-aaf1-8eb760dc6ee8" containerName="nova-cell1-novncproxy-novncproxy" 
Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.929161 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.930867 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.935991 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.936933 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 17:55:36 crc kubenswrapper[4690]: I0320 17:55:36.943595 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.032309 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4c84\" (UniqueName: \"kubernetes.io/projected/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-kube-api-access-n4c84\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.032390 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.032712 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.032850 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.033502 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.135588 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.135658 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4c84\" (UniqueName: 
\"kubernetes.io/projected/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-kube-api-access-n4c84\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.135689 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.135743 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.135763 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.140118 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.145707 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.146312 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.147167 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.161920 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4c84\" (UniqueName: \"kubernetes.io/projected/0d61bbf6-923c-45e5-9e55-42cb69c00b3b-kube-api-access-n4c84\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d61bbf6-923c-45e5-9e55-42cb69c00b3b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.253315 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.720673 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.721969 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.745188 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:55:37 crc kubenswrapper[4690]: I0320 17:55:37.894831 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0767f87b-816c-4824-aaf1-8eb760dc6ee8" path="/var/lib/kubelet/pods/0767f87b-816c-4824-aaf1-8eb760dc6ee8/volumes" Mar 20 17:55:38 crc kubenswrapper[4690]: I0320 17:55:38.579669 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d61bbf6-923c-45e5-9e55-42cb69c00b3b","Type":"ContainerStarted","Data":"634a64c4e32631c11e29c72ca185608c851e329fcc6c72df26929c02808925c7"} Mar 20 17:55:38 crc kubenswrapper[4690]: I0320 17:55:38.580104 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d61bbf6-923c-45e5-9e55-42cb69c00b3b","Type":"ContainerStarted","Data":"0e3cbcfd3e05648d8f658b22a9d6a5d2dbdaffa1296bbf598c2f6a56ae57d2fe"} Mar 20 17:55:38 crc kubenswrapper[4690]: I0320 17:55:38.622069 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.622044638 podStartE2EDuration="2.622044638s" podCreationTimestamp="2026-03-20 17:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:38.603175756 +0000 UTC m=+1413.469001464" watchObservedRunningTime="2026-03-20 17:55:38.622044638 +0000 UTC m=+1413.487870356" Mar 20 17:55:39 crc kubenswrapper[4690]: I0320 17:55:39.503372 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 17:55:39 crc kubenswrapper[4690]: I0320 17:55:39.725287 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:55:39 crc kubenswrapper[4690]: I0320 17:55:39.726903 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:55:39 crc kubenswrapper[4690]: I0320 17:55:39.728562 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.602271 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.770973 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7h684"] Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.774900 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.810245 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7h684"] Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.915497 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.915861 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.916017 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.916158 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-config\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.916312 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flcvv\" (UniqueName: \"kubernetes.io/projected/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-kube-api-access-flcvv\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:40 crc kubenswrapper[4690]: I0320 17:55:40.916471 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.017912 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.018217 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.018557 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.018614 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-config\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.018639 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flcvv\" (UniqueName: \"kubernetes.io/projected/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-kube-api-access-flcvv\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.018763 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.018952 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.019193 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.019959 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.020027 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.020679 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-config\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.037046 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flcvv\" (UniqueName: 
\"kubernetes.io/projected/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-kube-api-access-flcvv\") pod \"dnsmasq-dns-89c5cd4d5-7h684\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:41 crc kubenswrapper[4690]: I0320 17:55:41.108963 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:42 crc kubenswrapper[4690]: W0320 17:55:42.121210 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b5a6a56_2ecc_47cf_9f38_4fd2df362c77.slice/crio-601b8ba3957e1335746be44c44075457ab0c8ebbb31d05fa597a6f7a164d2b2c WatchSource:0}: Error finding container 601b8ba3957e1335746be44c44075457ab0c8ebbb31d05fa597a6f7a164d2b2c: Status 404 returned error can't find the container with id 601b8ba3957e1335746be44c44075457ab0c8ebbb31d05fa597a6f7a164d2b2c Mar 20 17:55:42 crc kubenswrapper[4690]: I0320 17:55:42.124422 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7h684"] Mar 20 17:55:42 crc kubenswrapper[4690]: I0320 17:55:42.253721 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:42 crc kubenswrapper[4690]: I0320 17:55:42.628026 4690 generic.go:334] "Generic (PLEG): container finished" podID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" containerID="20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1" exitCode=0 Mar 20 17:55:42 crc kubenswrapper[4690]: I0320 17:55:42.628136 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" event={"ID":"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77","Type":"ContainerDied","Data":"20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1"} Mar 20 17:55:42 crc kubenswrapper[4690]: I0320 17:55:42.628450 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" event={"ID":"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77","Type":"ContainerStarted","Data":"601b8ba3957e1335746be44c44075457ab0c8ebbb31d05fa597a6f7a164d2b2c"} Mar 20 17:55:42 crc kubenswrapper[4690]: I0320 17:55:42.986892 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.638393 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" event={"ID":"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77","Type":"ContainerStarted","Data":"22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6"} Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.638543 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-log" containerID="cri-o://96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e" gracePeriod=30 Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.638617 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-api" containerID="cri-o://b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd" gracePeriod=30 Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.673637 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" podStartSLOduration=3.673622012 podStartE2EDuration="3.673622012s" 
podCreationTimestamp="2026-03-20 17:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:43.670731012 +0000 UTC m=+1418.536556690" watchObservedRunningTime="2026-03-20 17:55:43.673622012 +0000 UTC m=+1418.539447690" Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.701766 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.702213 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="ceilometer-central-agent" containerID="cri-o://e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4" gracePeriod=30 Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.702317 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="proxy-httpd" containerID="cri-o://0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502" gracePeriod=30 Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.702401 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="sg-core" containerID="cri-o://b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3" gracePeriod=30 Mar 20 17:55:43 crc kubenswrapper[4690]: I0320 17:55:43.702374 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="ceilometer-notification-agent" containerID="cri-o://21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1" gracePeriod=30 Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.496917 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.634080 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-sg-core-conf-yaml\") pod \"db1320b9-1cd4-4756-b81a-c3eed18a7140\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.634414 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-config-data\") pod \"db1320b9-1cd4-4756-b81a-c3eed18a7140\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.634486 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k574\" (UniqueName: \"kubernetes.io/projected/db1320b9-1cd4-4756-b81a-c3eed18a7140-kube-api-access-5k574\") pod \"db1320b9-1cd4-4756-b81a-c3eed18a7140\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.634550 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-combined-ca-bundle\") pod \"db1320b9-1cd4-4756-b81a-c3eed18a7140\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.634592 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-log-httpd\") pod \"db1320b9-1cd4-4756-b81a-c3eed18a7140\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.634617 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-ceilometer-tls-certs\") pod \"db1320b9-1cd4-4756-b81a-c3eed18a7140\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.634689 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-scripts\") pod \"db1320b9-1cd4-4756-b81a-c3eed18a7140\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.634712 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-run-httpd\") pod \"db1320b9-1cd4-4756-b81a-c3eed18a7140\" (UID: \"db1320b9-1cd4-4756-b81a-c3eed18a7140\") " Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.635411 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "db1320b9-1cd4-4756-b81a-c3eed18a7140" (UID: "db1320b9-1cd4-4756-b81a-c3eed18a7140"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.635550 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "db1320b9-1cd4-4756-b81a-c3eed18a7140" (UID: "db1320b9-1cd4-4756-b81a-c3eed18a7140"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.641788 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1320b9-1cd4-4756-b81a-c3eed18a7140-kube-api-access-5k574" (OuterVolumeSpecName: "kube-api-access-5k574") pod "db1320b9-1cd4-4756-b81a-c3eed18a7140" (UID: "db1320b9-1cd4-4756-b81a-c3eed18a7140"). InnerVolumeSpecName "kube-api-access-5k574". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.652424 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-scripts" (OuterVolumeSpecName: "scripts") pod "db1320b9-1cd4-4756-b81a-c3eed18a7140" (UID: "db1320b9-1cd4-4756-b81a-c3eed18a7140"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.677854 4690 generic.go:334] "Generic (PLEG): container finished" podID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerID="0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502" exitCode=0 Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.677886 4690 generic.go:334] "Generic (PLEG): container finished" podID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerID="b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3" exitCode=2 Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.677896 4690 generic.go:334] "Generic (PLEG): container finished" podID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerID="21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1" exitCode=0 Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.677903 4690 generic.go:334] "Generic (PLEG): container finished" podID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerID="e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4" exitCode=0 Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.677944 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerDied","Data":"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502"} Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.677969 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerDied","Data":"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3"} Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.677980 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerDied","Data":"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1"} Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.677989 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerDied","Data":"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4"} Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.678113 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1320b9-1cd4-4756-b81a-c3eed18a7140","Type":"ContainerDied","Data":"57ffdfb621377fb59d10ddd552692d4c55803d502d40535b98f945f00685b5c7"} Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.678132 4690 scope.go:117] "RemoveContainer" containerID="0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.678270 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.679529 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "db1320b9-1cd4-4756-b81a-c3eed18a7140" (UID: "db1320b9-1cd4-4756-b81a-c3eed18a7140"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.689316 4690 generic.go:334] "Generic (PLEG): container finished" podID="317814bc-68b1-4454-953c-dfacdf66c9da" containerID="96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e" exitCode=143 Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.689644 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"317814bc-68b1-4454-953c-dfacdf66c9da","Type":"ContainerDied","Data":"96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e"} Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.689735 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.698490 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "db1320b9-1cd4-4756-b81a-c3eed18a7140" (UID: "db1320b9-1cd4-4756-b81a-c3eed18a7140"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.736932 4690 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.736958 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k574\" (UniqueName: \"kubernetes.io/projected/db1320b9-1cd4-4756-b81a-c3eed18a7140-kube-api-access-5k574\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.736967 4690 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.736976 4690 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.736985 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.737015 4690 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1320b9-1cd4-4756-b81a-c3eed18a7140-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.743244 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db1320b9-1cd4-4756-b81a-c3eed18a7140" (UID: "db1320b9-1cd4-4756-b81a-c3eed18a7140"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.769905 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-config-data" (OuterVolumeSpecName: "config-data") pod "db1320b9-1cd4-4756-b81a-c3eed18a7140" (UID: "db1320b9-1cd4-4756-b81a-c3eed18a7140"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.775104 4690 scope.go:117] "RemoveContainer" containerID="b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.802232 4690 scope.go:117] "RemoveContainer" containerID="21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.823676 4690 scope.go:117] "RemoveContainer" containerID="e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.838759 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.838808 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1320b9-1cd4-4756-b81a-c3eed18a7140-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.864608 4690 scope.go:117] "RemoveContainer" containerID="0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502" Mar 20 17:55:44 crc kubenswrapper[4690]: E0320 17:55:44.865097 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": container with ID starting with 0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502 not found: ID does not exist" containerID="0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.865152 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502"} err="failed to get container status \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": rpc error: code = NotFound desc = could not find container \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": container with ID starting with 0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.865192 4690 scope.go:117] "RemoveContainer" containerID="b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3" Mar 20 17:55:44 crc kubenswrapper[4690]: E0320 17:55:44.865654 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": container with ID starting with b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3 not found: ID does not exist" containerID="b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.865693 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3"} err="failed to get container status \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": rpc error: code = NotFound desc = could not find container \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": container with ID starting with 
b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.865712 4690 scope.go:117] "RemoveContainer" containerID="21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1" Mar 20 17:55:44 crc kubenswrapper[4690]: E0320 17:55:44.865914 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": container with ID starting with 21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1 not found: ID does not exist" containerID="21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.865940 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1"} err="failed to get container status \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": rpc error: code = NotFound desc = could not find container \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": container with ID starting with 21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.865956 4690 scope.go:117] "RemoveContainer" containerID="e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4" Mar 20 17:55:44 crc kubenswrapper[4690]: E0320 17:55:44.866121 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": container with ID starting with e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4 not found: ID does not exist" containerID="e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866151 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4"} err="failed to get container status \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": rpc error: code = NotFound desc = could not find container \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": container with ID starting with e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866177 4690 scope.go:117] "RemoveContainer" containerID="0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866361 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502"} err="failed to get container status \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": rpc error: code = NotFound desc = could not find container \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": container with ID starting with 0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866385 4690 scope.go:117] "RemoveContainer" containerID="b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3" Mar 20 17:55:44 crc 
kubenswrapper[4690]: I0320 17:55:44.866557 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3"} err="failed to get container status \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": rpc error: code = NotFound desc = could not find container \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": container with ID starting with b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866583 4690 scope.go:117] "RemoveContainer" containerID="21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866739 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1"} err="failed to get container status \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": rpc error: code = NotFound desc = could not find container \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": container with ID starting with 21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866760 4690 scope.go:117] "RemoveContainer" containerID="e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866902 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4"} err="failed to get container status \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": rpc error: code = NotFound desc = could not find container \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": container with ID starting with e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.866923 4690 scope.go:117] "RemoveContainer" containerID="0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867069 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502"} err="failed to get container status \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": rpc error: code = NotFound desc = could not find container \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": container with ID starting with 0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867089 4690 scope.go:117] "RemoveContainer" containerID="b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867278 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3"} err="failed to get container status \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": rpc error: code = NotFound desc = could not find container \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": container with ID 
starting with b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867308 4690 scope.go:117] "RemoveContainer" containerID="21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867476 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1"} err="failed to get container status \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": rpc error: code = NotFound desc = could not find container \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": container with ID starting with 21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867502 4690 scope.go:117] "RemoveContainer" containerID="e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867665 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4"} err="failed to get container status \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": rpc error: code = NotFound desc = could not find container \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": container with ID starting with e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867688 4690 scope.go:117] "RemoveContainer" containerID="0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867838 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502"} err="failed to get container status \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": rpc error: code = NotFound desc = could not find container \"0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502\": container with ID starting with 0e60e43b47a4a865acf7462b05a6f1f37792cdfee8fd036d956034721a8d5502 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.867859 4690 scope.go:117] "RemoveContainer" containerID="b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.868005 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3"} err="failed to get container status \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": rpc error: code = NotFound desc = could not find container \"b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3\": container with ID starting with b626f15a2b06d6a28e0976df179131417700c2831dc8477177e75c040c4cecb3 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.868025 4690 scope.go:117] "RemoveContainer" containerID="21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.868174 4690 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1"} err="failed to get container status \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": rpc error: code = NotFound desc = could not find container \"21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1\": container with ID starting with 21321ee395aef14fbc71d4126ffe48479fe453b8429666d92188d11427599ea1 not found: ID does not exist" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.868217 4690 scope.go:117] "RemoveContainer" containerID="e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4" Mar 20 17:55:44 crc kubenswrapper[4690]: I0320 17:55:44.868502 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4"} err="failed to get container status \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": rpc error: code = NotFound desc = could not find container \"e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4\": container with ID starting with e9e131e0321bc2bd64fab209bda5a6688cdff571f9a35ce09702536f74c753a4 not found: ID does not exist" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.031330 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.044330 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.059623 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:45 crc kubenswrapper[4690]: E0320 17:55:45.060095 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="ceilometer-notification-agent" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.060118 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="ceilometer-notification-agent" Mar 20 17:55:45 crc kubenswrapper[4690]: E0320 17:55:45.060155 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="proxy-httpd" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.060163 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="proxy-httpd" Mar 20 17:55:45 crc kubenswrapper[4690]: E0320 17:55:45.060182 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="ceilometer-central-agent" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.060192 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="ceilometer-central-agent" Mar 20 17:55:45 crc kubenswrapper[4690]: E0320 17:55:45.060215 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="sg-core" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.060224 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="sg-core" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.060705 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="ceilometer-central-agent" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 
17:55:45.060749 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="ceilometer-notification-agent" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.060761 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="proxy-httpd" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.060779 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" containerName="sg-core" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.062873 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.066010 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.066121 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.066010 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.069146 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.245532 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.245613 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-config-data\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.245692 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxrzz\" (UniqueName: \"kubernetes.io/projected/8a58c4cd-dea7-417a-a296-6de5e559294f-kube-api-access-jxrzz\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.245713 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.245731 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.245799 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-scripts\") pod \"ceilometer-0\" (UID: 
\"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.245832 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.245985 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.348166 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-scripts\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.348469 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.348509 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.348554 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.348585 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-config-data\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.348634 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxrzz\" (UniqueName: \"kubernetes.io/projected/8a58c4cd-dea7-417a-a296-6de5e559294f-kube-api-access-jxrzz\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.348654 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.348670 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.349088 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.349390 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.352572 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-scripts\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.353149 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-config-data\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.353312 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.354600 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.356364 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.380205 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxrzz\" (UniqueName: \"kubernetes.io/projected/8a58c4cd-dea7-417a-a296-6de5e559294f-kube-api-access-jxrzz\") pod \"ceilometer-0\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.414714 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.547855 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.868808 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:45 crc kubenswrapper[4690]: W0320 17:55:45.877029 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a58c4cd_dea7_417a_a296_6de5e559294f.slice/crio-e1acbf2fb974464045026382a508b52535f0e659b76a970d006e91927dd99d59 WatchSource:0}: Error finding container e1acbf2fb974464045026382a508b52535f0e659b76a970d006e91927dd99d59: Status 404 returned error can't find the container with id e1acbf2fb974464045026382a508b52535f0e659b76a970d006e91927dd99d59 Mar 20 17:55:45 crc kubenswrapper[4690]: I0320 17:55:45.920694 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1320b9-1cd4-4756-b81a-c3eed18a7140" path="/var/lib/kubelet/pods/db1320b9-1cd4-4756-b81a-c3eed18a7140/volumes" Mar 20 17:55:46 crc kubenswrapper[4690]: I0320 17:55:46.708988 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerStarted","Data":"68fca97fdd34b769e6f4db6341de981cc26cfe2a8fedbe69af61702db1aa8401"} Mar 20 17:55:46 crc kubenswrapper[4690]: I0320 17:55:46.709043 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerStarted","Data":"e1acbf2fb974464045026382a508b52535f0e659b76a970d006e91927dd99d59"} Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.234320 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.254879 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.281436 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.390286 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-config-data\") pod \"317814bc-68b1-4454-953c-dfacdf66c9da\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.390394 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-combined-ca-bundle\") pod \"317814bc-68b1-4454-953c-dfacdf66c9da\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.390448 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/317814bc-68b1-4454-953c-dfacdf66c9da-logs\") pod \"317814bc-68b1-4454-953c-dfacdf66c9da\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.390488 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/317814bc-68b1-4454-953c-dfacdf66c9da-kube-api-access-g6cwx\") pod \"317814bc-68b1-4454-953c-dfacdf66c9da\" (UID: \"317814bc-68b1-4454-953c-dfacdf66c9da\") " Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.390932 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/317814bc-68b1-4454-953c-dfacdf66c9da-logs" (OuterVolumeSpecName: "logs") pod "317814bc-68b1-4454-953c-dfacdf66c9da" (UID: "317814bc-68b1-4454-953c-dfacdf66c9da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.391053 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/317814bc-68b1-4454-953c-dfacdf66c9da-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.395991 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317814bc-68b1-4454-953c-dfacdf66c9da-kube-api-access-g6cwx" (OuterVolumeSpecName: "kube-api-access-g6cwx") pod "317814bc-68b1-4454-953c-dfacdf66c9da" (UID: "317814bc-68b1-4454-953c-dfacdf66c9da"). InnerVolumeSpecName "kube-api-access-g6cwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.424421 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "317814bc-68b1-4454-953c-dfacdf66c9da" (UID: "317814bc-68b1-4454-953c-dfacdf66c9da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.428566 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-config-data" (OuterVolumeSpecName: "config-data") pod "317814bc-68b1-4454-953c-dfacdf66c9da" (UID: "317814bc-68b1-4454-953c-dfacdf66c9da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.492525 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.492565 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317814bc-68b1-4454-953c-dfacdf66c9da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.492582 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/317814bc-68b1-4454-953c-dfacdf66c9da-kube-api-access-g6cwx\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.728991 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerStarted","Data":"adf2b7adec0431ebd135cdc93755d3275cd95f9000e37b11dcd1e2e2d7390c95"} Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.733189 4690 generic.go:334] "Generic (PLEG): container finished" podID="317814bc-68b1-4454-953c-dfacdf66c9da" containerID="b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd" exitCode=0 Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.733433 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.733667 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"317814bc-68b1-4454-953c-dfacdf66c9da","Type":"ContainerDied","Data":"b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd"} Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.733705 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"317814bc-68b1-4454-953c-dfacdf66c9da","Type":"ContainerDied","Data":"82854f3238cfd87823f435c25dc3a68f3036ed6aade5ee9906bd1f4e6ed24707"} Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.733726 4690 scope.go:117] "RemoveContainer" containerID="b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.755393 4690 scope.go:117] "RemoveContainer" containerID="96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.762935 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.775718 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.778104 4690 scope.go:117] "RemoveContainer" containerID="b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd" Mar 20 17:55:47 crc kubenswrapper[4690]: E0320 17:55:47.778623 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd\": container with ID starting with b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd not found: ID does not exist" containerID="b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.778653 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd"} err="failed to get container status \"b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd\": rpc error: code = NotFound desc = could not find container \"b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd\": container with ID starting with b0097d716724b9802f8cade39d53d64466071d0a06d455751900a65c4a4e0efd not found: ID does not exist" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.778675 4690 scope.go:117] "RemoveContainer" containerID="96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e" Mar 20 17:55:47 crc kubenswrapper[4690]: E0320 17:55:47.779005 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e\": container with ID starting with 96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e not found: ID does not exist" containerID="96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.779052 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e"} err="failed to get container status \"96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e\": rpc error: code 
= NotFound desc = could not find container \"96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e\": container with ID starting with 96fe73bf887097afa9c5596684bf3dde991e88b351de63ab57bbc7ac16d5ab4e not found: ID does not exist" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.785629 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.811192 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:47 crc kubenswrapper[4690]: E0320 17:55:47.811631 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-api" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.811647 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-api" Mar 20 17:55:47 crc kubenswrapper[4690]: E0320 17:55:47.811673 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-log" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.811679 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-log" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.811869 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-api" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.811894 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" containerName="nova-api-log" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.812895 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.816323 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.816428 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.816333 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.834226 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.895705 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317814bc-68b1-4454-953c-dfacdf66c9da" path="/var/lib/kubelet/pods/317814bc-68b1-4454-953c-dfacdf66c9da/volumes" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.900244 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqzg9\" (UniqueName: \"kubernetes.io/projected/d7846f23-5aa1-4613-a307-9e4bc7d372bb-kube-api-access-zqzg9\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.900540 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-config-data\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.900649 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.900740 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7846f23-5aa1-4613-a307-9e4bc7d372bb-logs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.900875 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.901009 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.992634 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5q5q8"] Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.993986 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.998198 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 17:55:47 crc kubenswrapper[4690]: I0320 17:55:47.998632 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.002141 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.002339 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.002449 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqzg9\" (UniqueName: \"kubernetes.io/projected/d7846f23-5aa1-4613-a307-9e4bc7d372bb-kube-api-access-zqzg9\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.002582 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-config-data\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.002668 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.002757 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7846f23-5aa1-4613-a307-9e4bc7d372bb-logs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.005637 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7846f23-5aa1-4613-a307-9e4bc7d372bb-logs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.009924 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.011050 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 
20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.012136 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5q5q8"] Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.013201 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-config-data\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.016706 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.026583 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqzg9\" (UniqueName: \"kubernetes.io/projected/d7846f23-5aa1-4613-a307-9e4bc7d372bb-kube-api-access-zqzg9\") pod \"nova-api-0\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.104326 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-scripts\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.104631 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6tv2\" (UniqueName: \"kubernetes.io/projected/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-kube-api-access-r6tv2\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.104656 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.104725 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-config-data\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.138284 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.206672 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-scripts\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.206767 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6tv2\" (UniqueName: \"kubernetes.io/projected/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-kube-api-access-r6tv2\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.206788 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.206821 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-config-data\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.211327 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-config-data\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.211714 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-scripts\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.212123 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.226777 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6tv2\" (UniqueName: \"kubernetes.io/projected/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-kube-api-access-r6tv2\") pod \"nova-cell1-cell-mapping-5q5q8\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.311219 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.608053 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.751774 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerStarted","Data":"68004e2e6d998f4d9c40d85ccc032f41687549f20ff6478f052f3ecf426a223c"} Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.752287 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5q5q8"] Mar 20 17:55:48 crc kubenswrapper[4690]: W0320 17:55:48.752536 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1296ff75_6f88_4e2c_bf63_b46c3b090a6d.slice/crio-3cad27e2f02bf123d37701ba36ff7a38e8a7b0136e0addcfccdb415919a53251 WatchSource:0}: Error finding container 3cad27e2f02bf123d37701ba36ff7a38e8a7b0136e0addcfccdb415919a53251: Status 404 returned error can't find the container with id 3cad27e2f02bf123d37701ba36ff7a38e8a7b0136e0addcfccdb415919a53251 Mar 20 17:55:48 crc kubenswrapper[4690]: I0320 17:55:48.753955 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7846f23-5aa1-4613-a307-9e4bc7d372bb","Type":"ContainerStarted","Data":"e538813479e794dbee17f8c171bc958bcc2011472b5105fd76da522df1785e8d"} Mar 20 17:55:49 crc kubenswrapper[4690]: I0320 17:55:49.762164 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7846f23-5aa1-4613-a307-9e4bc7d372bb","Type":"ContainerStarted","Data":"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff"} Mar 20 17:55:49 crc kubenswrapper[4690]: I0320 17:55:49.762207 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7846f23-5aa1-4613-a307-9e4bc7d372bb","Type":"ContainerStarted","Data":"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a"} Mar 20 17:55:49 crc kubenswrapper[4690]: I0320 17:55:49.765009 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5q5q8" event={"ID":"1296ff75-6f88-4e2c-bf63-b46c3b090a6d","Type":"ContainerStarted","Data":"5b896b52c746e31af9bd4f8d5f536765ca7340c733d71497868aaaf70f62f225"} Mar 20 17:55:49 crc kubenswrapper[4690]: I0320 17:55:49.765053 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5q5q8" event={"ID":"1296ff75-6f88-4e2c-bf63-b46c3b090a6d","Type":"ContainerStarted","Data":"3cad27e2f02bf123d37701ba36ff7a38e8a7b0136e0addcfccdb415919a53251"} Mar 20 17:55:49 crc kubenswrapper[4690]: I0320 17:55:49.792986 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.792965751 podStartE2EDuration="2.792965751s" podCreationTimestamp="2026-03-20 17:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:49.785177096 +0000 UTC m=+1424.651002774" watchObservedRunningTime="2026-03-20 17:55:49.792965751 +0000 UTC m=+1424.658791429" Mar 20 17:55:49 crc kubenswrapper[4690]: I0320 17:55:49.808397 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5q5q8" podStartSLOduration=2.808379898 podStartE2EDuration="2.808379898s" 
podCreationTimestamp="2026-03-20 17:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:49.804376798 +0000 UTC m=+1424.670202486" watchObservedRunningTime="2026-03-20 17:55:49.808379898 +0000 UTC m=+1424.674205586" Mar 20 17:55:50 crc kubenswrapper[4690]: I0320 17:55:50.776174 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerStarted","Data":"28558a80d2fed1dd92d2dbd5d6b42cb9ff2a43b9dfa2186798e6ac88c879ab0a"} Mar 20 17:55:50 crc kubenswrapper[4690]: I0320 17:55:50.778207 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="ceilometer-central-agent" containerID="cri-o://68fca97fdd34b769e6f4db6341de981cc26cfe2a8fedbe69af61702db1aa8401" gracePeriod=30 Mar 20 17:55:50 crc kubenswrapper[4690]: I0320 17:55:50.778228 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="proxy-httpd" containerID="cri-o://28558a80d2fed1dd92d2dbd5d6b42cb9ff2a43b9dfa2186798e6ac88c879ab0a" gracePeriod=30 Mar 20 17:55:50 crc kubenswrapper[4690]: I0320 17:55:50.778221 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="sg-core" containerID="cri-o://68004e2e6d998f4d9c40d85ccc032f41687549f20ff6478f052f3ecf426a223c" gracePeriod=30 Mar 20 17:55:50 crc kubenswrapper[4690]: I0320 17:55:50.778284 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="ceilometer-notification-agent" containerID="cri-o://adf2b7adec0431ebd135cdc93755d3275cd95f9000e37b11dcd1e2e2d7390c95" gracePeriod=30 Mar 20 17:55:50 crc kubenswrapper[4690]: I0320 17:55:50.805066 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6643869329999998 podStartE2EDuration="5.805034715s" podCreationTimestamp="2026-03-20 17:55:45 +0000 UTC" firstStartedPulling="2026-03-20 17:55:45.879500632 +0000 UTC m=+1420.745326320" lastFinishedPulling="2026-03-20 17:55:50.020148424 +0000 UTC m=+1424.885974102" observedRunningTime="2026-03-20 17:55:50.801879218 +0000 UTC m=+1425.667704916" watchObservedRunningTime="2026-03-20 17:55:50.805034715 +0000 UTC m=+1425.670860433" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.110191 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.169530 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-q25ft"] Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.169808 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" podUID="c4134682-ffd8-4189-9abd-bf4f23b57a90" containerName="dnsmasq-dns" containerID="cri-o://b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0" gracePeriod=10 Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.610173 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.679008 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-sb\") pod \"c4134682-ffd8-4189-9abd-bf4f23b57a90\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.679123 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw2bn\" (UniqueName: \"kubernetes.io/projected/c4134682-ffd8-4189-9abd-bf4f23b57a90-kube-api-access-vw2bn\") pod \"c4134682-ffd8-4189-9abd-bf4f23b57a90\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.679185 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-nb\") pod \"c4134682-ffd8-4189-9abd-bf4f23b57a90\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.679234 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-swift-storage-0\") pod \"c4134682-ffd8-4189-9abd-bf4f23b57a90\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.679288 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-config\") pod \"c4134682-ffd8-4189-9abd-bf4f23b57a90\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.679439 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-svc\") pod \"c4134682-ffd8-4189-9abd-bf4f23b57a90\" (UID: \"c4134682-ffd8-4189-9abd-bf4f23b57a90\") " Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.691767 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4134682-ffd8-4189-9abd-bf4f23b57a90-kube-api-access-vw2bn" (OuterVolumeSpecName: "kube-api-access-vw2bn") pod "c4134682-ffd8-4189-9abd-bf4f23b57a90" (UID: "c4134682-ffd8-4189-9abd-bf4f23b57a90"). InnerVolumeSpecName "kube-api-access-vw2bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.737895 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c4134682-ffd8-4189-9abd-bf4f23b57a90" (UID: "c4134682-ffd8-4189-9abd-bf4f23b57a90"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.743064 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4134682-ffd8-4189-9abd-bf4f23b57a90" (UID: "c4134682-ffd8-4189-9abd-bf4f23b57a90"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.743094 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4134682-ffd8-4189-9abd-bf4f23b57a90" (UID: "c4134682-ffd8-4189-9abd-bf4f23b57a90"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.762021 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4134682-ffd8-4189-9abd-bf4f23b57a90" (UID: "c4134682-ffd8-4189-9abd-bf4f23b57a90"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.774514 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-config" (OuterVolumeSpecName: "config") pod "c4134682-ffd8-4189-9abd-bf4f23b57a90" (UID: "c4134682-ffd8-4189-9abd-bf4f23b57a90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.781606 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.781634 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.781645 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw2bn\" (UniqueName: \"kubernetes.io/projected/c4134682-ffd8-4189-9abd-bf4f23b57a90-kube-api-access-vw2bn\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.781655 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.781665 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.781673 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4134682-ffd8-4189-9abd-bf4f23b57a90-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.788583 4690 generic.go:334] "Generic (PLEG): container finished" podID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerID="28558a80d2fed1dd92d2dbd5d6b42cb9ff2a43b9dfa2186798e6ac88c879ab0a" exitCode=0 Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.788615 4690 generic.go:334] "Generic (PLEG): container finished" podID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerID="68004e2e6d998f4d9c40d85ccc032f41687549f20ff6478f052f3ecf426a223c" exitCode=2 Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.788622 4690 generic.go:334] "Generic (PLEG): 
container finished" podID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerID="adf2b7adec0431ebd135cdc93755d3275cd95f9000e37b11dcd1e2e2d7390c95" exitCode=0 Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.788681 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerDied","Data":"28558a80d2fed1dd92d2dbd5d6b42cb9ff2a43b9dfa2186798e6ac88c879ab0a"} Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.788823 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerDied","Data":"68004e2e6d998f4d9c40d85ccc032f41687549f20ff6478f052f3ecf426a223c"} Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.788842 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerDied","Data":"adf2b7adec0431ebd135cdc93755d3275cd95f9000e37b11dcd1e2e2d7390c95"} Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.790639 4690 generic.go:334] "Generic (PLEG): container finished" podID="c4134682-ffd8-4189-9abd-bf4f23b57a90" containerID="b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0" exitCode=0 Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.790690 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" event={"ID":"c4134682-ffd8-4189-9abd-bf4f23b57a90","Type":"ContainerDied","Data":"b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0"} Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.790691 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.790743 4690 scope.go:117] "RemoveContainer" containerID="b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.790730 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-q25ft" event={"ID":"c4134682-ffd8-4189-9abd-bf4f23b57a90","Type":"ContainerDied","Data":"a6bb5d309d753e41fe1a364995cbb0872e1d966766e4cea29d793b2aacf9c3b2"} Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.824168 4690 scope.go:117] "RemoveContainer" containerID="5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.841594 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-q25ft"] Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.851793 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-q25ft"] Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.859810 4690 scope.go:117] "RemoveContainer" containerID="b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0" Mar 20 17:55:51 crc kubenswrapper[4690]: E0320 17:55:51.866064 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0\": container with ID starting with b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0 not found: ID does not exist" containerID="b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.866115 4690 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0"} err="failed to get container status \"b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0\": rpc error: code = NotFound desc = could not find container \"b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0\": container with ID starting with b4428763cecbb68afbf2440b8d1f4ae5da26dfa679171253130813a80c49e5a0 not found: ID does not exist" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.866143 4690 scope.go:117] "RemoveContainer" containerID="5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282" Mar 20 17:55:51 crc kubenswrapper[4690]: E0320 17:55:51.866607 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282\": container with ID starting with 5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282 not found: ID does not exist" containerID="5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.866655 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282"} err="failed to get container status \"5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282\": rpc error: code = NotFound desc = could not find container \"5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282\": container with ID starting with 5dec7639a6ca06587a3dc3a15fa008efdd0ccbcd1a868dfab6c96fc9d8d85282 not found: ID does not exist" Mar 20 17:55:51 crc kubenswrapper[4690]: I0320 17:55:51.894883 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4134682-ffd8-4189-9abd-bf4f23b57a90" path="/var/lib/kubelet/pods/c4134682-ffd8-4189-9abd-bf4f23b57a90/volumes" Mar 20 17:55:53 crc kubenswrapper[4690]: I0320 17:55:53.812277 4690 generic.go:334] "Generic (PLEG): container finished" podID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerID="68fca97fdd34b769e6f4db6341de981cc26cfe2a8fedbe69af61702db1aa8401" exitCode=0 Mar 20 17:55:53 crc kubenswrapper[4690]: I0320 17:55:53.812452 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerDied","Data":"68fca97fdd34b769e6f4db6341de981cc26cfe2a8fedbe69af61702db1aa8401"} Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.150217 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.229582 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-scripts\") pod \"8a58c4cd-dea7-417a-a296-6de5e559294f\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.229732 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxrzz\" (UniqueName: \"kubernetes.io/projected/8a58c4cd-dea7-417a-a296-6de5e559294f-kube-api-access-jxrzz\") pod \"8a58c4cd-dea7-417a-a296-6de5e559294f\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.229819 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-log-httpd\") pod \"8a58c4cd-dea7-417a-a296-6de5e559294f\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.229847 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-sg-core-conf-yaml\") pod \"8a58c4cd-dea7-417a-a296-6de5e559294f\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.229884 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-combined-ca-bundle\") pod \"8a58c4cd-dea7-417a-a296-6de5e559294f\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.230116 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-run-httpd\") pod \"8a58c4cd-dea7-417a-a296-6de5e559294f\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.230186 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-config-data\") pod \"8a58c4cd-dea7-417a-a296-6de5e559294f\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.230406 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-ceilometer-tls-certs\") pod \"8a58c4cd-dea7-417a-a296-6de5e559294f\" (UID: \"8a58c4cd-dea7-417a-a296-6de5e559294f\") " Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.235110 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a58c4cd-dea7-417a-a296-6de5e559294f" (UID: "8a58c4cd-dea7-417a-a296-6de5e559294f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.235161 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a58c4cd-dea7-417a-a296-6de5e559294f" (UID: "8a58c4cd-dea7-417a-a296-6de5e559294f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.251985 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-scripts" (OuterVolumeSpecName: "scripts") pod "8a58c4cd-dea7-417a-a296-6de5e559294f" (UID: "8a58c4cd-dea7-417a-a296-6de5e559294f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.252066 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a58c4cd-dea7-417a-a296-6de5e559294f-kube-api-access-jxrzz" (OuterVolumeSpecName: "kube-api-access-jxrzz") pod "8a58c4cd-dea7-417a-a296-6de5e559294f" (UID: "8a58c4cd-dea7-417a-a296-6de5e559294f"). InnerVolumeSpecName "kube-api-access-jxrzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.274177 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.274643 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.274690 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.275502 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6c26ff37905c4c37c818991d48555bc929721ae7acd19a88c41bd55b417a5fe"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.275559 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://c6c26ff37905c4c37c818991d48555bc929721ae7acd19a88c41bd55b417a5fe" gracePeriod=600 Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.288510 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a58c4cd-dea7-417a-a296-6de5e559294f" (UID: "8a58c4cd-dea7-417a-a296-6de5e559294f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.305495 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a58c4cd-dea7-417a-a296-6de5e559294f" (UID: "8a58c4cd-dea7-417a-a296-6de5e559294f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.305972 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8a58c4cd-dea7-417a-a296-6de5e559294f" (UID: "8a58c4cd-dea7-417a-a296-6de5e559294f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.328417 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-config-data" (OuterVolumeSpecName: "config-data") pod "8a58c4cd-dea7-417a-a296-6de5e559294f" (UID: "8a58c4cd-dea7-417a-a296-6de5e559294f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.332268 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxrzz\" (UniqueName: \"kubernetes.io/projected/8a58c4cd-dea7-417a-a296-6de5e559294f-kube-api-access-jxrzz\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.332299 4690 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.332412 4690 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.332424 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.332434 4690 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a58c4cd-dea7-417a-a296-6de5e559294f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.332443 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.332452 4690 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.332462 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a58c4cd-dea7-417a-a296-6de5e559294f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:54 crc kubenswrapper[4690]: 
I0320 17:55:54.828970 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="c6c26ff37905c4c37c818991d48555bc929721ae7acd19a88c41bd55b417a5fe" exitCode=0 Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.829115 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"c6c26ff37905c4c37c818991d48555bc929721ae7acd19a88c41bd55b417a5fe"} Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.829564 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930"} Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.829601 4690 scope.go:117] "RemoveContainer" containerID="ab2561b6600e9d6bebb46c2c746c35623906cf56d05e6dcd356c447e3e87dfa1" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.834620 4690 generic.go:334] "Generic (PLEG): container finished" podID="1296ff75-6f88-4e2c-bf63-b46c3b090a6d" containerID="5b896b52c746e31af9bd4f8d5f536765ca7340c733d71497868aaaf70f62f225" exitCode=0 Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.834677 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5q5q8" event={"ID":"1296ff75-6f88-4e2c-bf63-b46c3b090a6d","Type":"ContainerDied","Data":"5b896b52c746e31af9bd4f8d5f536765ca7340c733d71497868aaaf70f62f225"} Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.845303 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a58c4cd-dea7-417a-a296-6de5e559294f","Type":"ContainerDied","Data":"e1acbf2fb974464045026382a508b52535f0e659b76a970d006e91927dd99d59"} Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.845535 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.923076 4690 scope.go:117] "RemoveContainer" containerID="28558a80d2fed1dd92d2dbd5d6b42cb9ff2a43b9dfa2186798e6ac88c879ab0a" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.968872 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.970748 4690 scope.go:117] "RemoveContainer" containerID="68004e2e6d998f4d9c40d85ccc032f41687549f20ff6478f052f3ecf426a223c" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.976384 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.983949 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:54 crc kubenswrapper[4690]: E0320 17:55:54.984422 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="ceilometer-notification-agent" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.984481 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="ceilometer-notification-agent" Mar 20 17:55:54 crc kubenswrapper[4690]: E0320 17:55:54.984600 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="proxy-httpd" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.984647 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="proxy-httpd" Mar 20 17:55:54 crc kubenswrapper[4690]: E0320 17:55:54.984735 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4134682-ffd8-4189-9abd-bf4f23b57a90" containerName="dnsmasq-dns" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.984782 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4134682-ffd8-4189-9abd-bf4f23b57a90" containerName="dnsmasq-dns" Mar 20 17:55:54 crc kubenswrapper[4690]: E0320 17:55:54.984831 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4134682-ffd8-4189-9abd-bf4f23b57a90" containerName="init" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.984875 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4134682-ffd8-4189-9abd-bf4f23b57a90" containerName="init" Mar 20 17:55:54 crc kubenswrapper[4690]: E0320 17:55:54.984939 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="ceilometer-central-agent" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.984984 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="ceilometer-central-agent" Mar 20 17:55:54 crc kubenswrapper[4690]: E0320 17:55:54.985044 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="sg-core" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.985092 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="sg-core" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.985417 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="sg-core" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.985488 4690 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="ceilometer-central-agent" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.985551 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="proxy-httpd" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.985604 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" containerName="ceilometer-notification-agent" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.985663 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4134682-ffd8-4189-9abd-bf4f23b57a90" containerName="dnsmasq-dns" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.987579 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.990993 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.991196 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.991385 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:55:54 crc kubenswrapper[4690]: I0320 17:55:54.997383 4690 scope.go:117] "RemoveContainer" containerID="adf2b7adec0431ebd135cdc93755d3275cd95f9000e37b11dcd1e2e2d7390c95" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.004675 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.031520 4690 scope.go:117] "RemoveContainer" containerID="68fca97fdd34b769e6f4db6341de981cc26cfe2a8fedbe69af61702db1aa8401" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.149071 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.149170 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.149221 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-log-httpd\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.149240 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-scripts\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.149273 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-config-data\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.149301 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.149395 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6snp\" (UniqueName: \"kubernetes.io/projected/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-kube-api-access-f6snp\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.149589 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-run-httpd\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252322 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252466 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-log-httpd\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252505 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-scripts\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252539 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-config-data\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252593 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252629 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6snp\" (UniqueName: \"kubernetes.io/projected/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-kube-api-access-f6snp\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252679 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-run-httpd\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252787 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.252983 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-log-httpd\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.254066 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-run-httpd\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.258299 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.258570 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-scripts\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.259233 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-config-data\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.260500 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.264634 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.277781 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6snp\" (UniqueName: \"kubernetes.io/projected/fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5-kube-api-access-f6snp\") pod \"ceilometer-0\" (UID: \"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5\") " pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.308370 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.782018 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:55:55 crc kubenswrapper[4690]: W0320 17:55:55.785515 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc998c2a_5f75_4a3b_b62a_1d6f8fbfc6d5.slice/crio-30db53a2e80cbf136759b26ab4bd2e2b48fb8e008176559c3c08762dc9dfc68b WatchSource:0}: Error finding container 30db53a2e80cbf136759b26ab4bd2e2b48fb8e008176559c3c08762dc9dfc68b: Status 404 returned error can't find the container with id 30db53a2e80cbf136759b26ab4bd2e2b48fb8e008176559c3c08762dc9dfc68b Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.856531 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5","Type":"ContainerStarted","Data":"30db53a2e80cbf136759b26ab4bd2e2b48fb8e008176559c3c08762dc9dfc68b"} Mar 20 17:55:55 crc kubenswrapper[4690]: I0320 17:55:55.926871 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a58c4cd-dea7-417a-a296-6de5e559294f" path="/var/lib/kubelet/pods/8a58c4cd-dea7-417a-a296-6de5e559294f/volumes" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.522227 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.680411 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-config-data\") pod \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.680515 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-combined-ca-bundle\") pod \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.680605 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-scripts\") pod \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.680671 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6tv2\" (UniqueName: \"kubernetes.io/projected/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-kube-api-access-r6tv2\") pod \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\" (UID: \"1296ff75-6f88-4e2c-bf63-b46c3b090a6d\") " Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.690397 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-scripts" (OuterVolumeSpecName: "scripts") pod "1296ff75-6f88-4e2c-bf63-b46c3b090a6d" (UID: "1296ff75-6f88-4e2c-bf63-b46c3b090a6d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.690738 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-kube-api-access-r6tv2" (OuterVolumeSpecName: "kube-api-access-r6tv2") pod "1296ff75-6f88-4e2c-bf63-b46c3b090a6d" (UID: "1296ff75-6f88-4e2c-bf63-b46c3b090a6d"). InnerVolumeSpecName "kube-api-access-r6tv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.726348 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-config-data" (OuterVolumeSpecName: "config-data") pod "1296ff75-6f88-4e2c-bf63-b46c3b090a6d" (UID: "1296ff75-6f88-4e2c-bf63-b46c3b090a6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.744827 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1296ff75-6f88-4e2c-bf63-b46c3b090a6d" (UID: "1296ff75-6f88-4e2c-bf63-b46c3b090a6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.783827 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.783877 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.783892 4690 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.783904 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6tv2\" (UniqueName: \"kubernetes.io/projected/1296ff75-6f88-4e2c-bf63-b46c3b090a6d-kube-api-access-r6tv2\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.872782 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5q5q8" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.872778 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5q5q8" event={"ID":"1296ff75-6f88-4e2c-bf63-b46c3b090a6d","Type":"ContainerDied","Data":"3cad27e2f02bf123d37701ba36ff7a38e8a7b0136e0addcfccdb415919a53251"} Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.872897 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cad27e2f02bf123d37701ba36ff7a38e8a7b0136e0addcfccdb415919a53251" Mar 20 17:55:56 crc kubenswrapper[4690]: I0320 17:55:56.875377 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5","Type":"ContainerStarted","Data":"25336bae8b5fe28113977edda0f1f08764f71d714f949576d1a521d9006af47b"} Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.150173 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.150488 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="27f99cff-5842-4132-89a9-3cc1872139cf" containerName="nova-scheduler-scheduler" containerID="cri-o://c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7" gracePeriod=30 Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.175775 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.176452 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerName="nova-api-log" containerID="cri-o://0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a" gracePeriod=30 Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.177007 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerName="nova-api-api" containerID="cri-o://abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff" gracePeriod=30 Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.186507 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.186752 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-log" containerID="cri-o://cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274" gracePeriod=30 Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.186887 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-metadata" containerID="cri-o://7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8" gracePeriod=30 Mar 20 17:55:57 crc kubenswrapper[4690]: E0320 17:55:57.736994 4690 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:55:57 crc kubenswrapper[4690]: E0320 
17:55:57.756503 4690 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:55:57 crc kubenswrapper[4690]: E0320 17:55:57.765756 4690 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:55:57 crc kubenswrapper[4690]: E0320 17:55:57.765829 4690 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="27f99cff-5842-4132-89a9-3cc1872139cf" containerName="nova-scheduler-scheduler" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.839159 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.888720 4690 generic.go:334] "Generic (PLEG): container finished" podID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerID="cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274" exitCode=143 Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.891553 4690 generic.go:334] "Generic (PLEG): container finished" podID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerID="abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff" exitCode=0 Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.891568 4690 generic.go:334] "Generic (PLEG): container finished" podID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerID="0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a" exitCode=143 Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.891628 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.907688 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0953a097-ab34-4b4e-8389-00cc858d9a36","Type":"ContainerDied","Data":"cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274"} Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.907725 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7846f23-5aa1-4613-a307-9e4bc7d372bb","Type":"ContainerDied","Data":"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff"} Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.907739 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7846f23-5aa1-4613-a307-9e4bc7d372bb","Type":"ContainerDied","Data":"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a"} Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.907749 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7846f23-5aa1-4613-a307-9e4bc7d372bb","Type":"ContainerDied","Data":"e538813479e794dbee17f8c171bc958bcc2011472b5105fd76da522df1785e8d"} Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.907760 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5","Type":"ContainerStarted","Data":"e2251ef6163777a6464aeb4dfda1ecb95e9b0aa5b0e645cfe0bbcef3b4e4b1a7"} Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.907777 4690 scope.go:117] "RemoveContainer" containerID="abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.949528 4690 scope.go:117] "RemoveContainer" containerID="0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.992521 4690 scope.go:117] "RemoveContainer" containerID="abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff" Mar 20 17:55:57 crc kubenswrapper[4690]: E0320 17:55:57.993190 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff\": container with ID starting with abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff not found: ID does not exist" containerID="abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.993223 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff"} err="failed to get container status \"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff\": rpc error: code = NotFound desc = could not find container \"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff\": container with ID starting with abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff not found: ID does not exist" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.993243 4690 scope.go:117] "RemoveContainer" containerID="0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a" Mar 20 17:55:57 crc kubenswrapper[4690]: E0320 17:55:57.994270 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a\": container with ID starting with 0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a not found: ID does not exist" containerID="0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.994292 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a"} err="failed to get container status \"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a\": rpc error: code = NotFound desc = could not find container \"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a\": container with ID starting with 0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a not found: ID does not exist" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.994304 4690 scope.go:117] "RemoveContainer" containerID="abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.997079 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff"} err="failed to get container status \"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff\": rpc error: code = NotFound desc = could not find container \"abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff\": container with ID starting with abc1b66adb188024a85af79cc12690f2277af075cff037e71d81b558c4f5b2ff not found: ID does not exist" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.997101 4690 scope.go:117] "RemoveContainer" containerID="0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a" Mar 20 17:55:57 crc kubenswrapper[4690]: I0320 17:55:57.997581 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a"} err="failed to get container status \"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a\": rpc error: code = NotFound desc = could not find container \"0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a\": container with ID starting with 0461f510933459a5848e00a5cc7243698f4679e5d6749f37f0384c304c232c4a not found: ID does not exist" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.021739 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-config-data\") pod \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.021859 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-internal-tls-certs\") pod \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.021916 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-public-tls-certs\") pod \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.021959 4690 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7846f23-5aa1-4613-a307-9e4bc7d372bb-logs\") pod \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.021988 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqzg9\" (UniqueName: \"kubernetes.io/projected/d7846f23-5aa1-4613-a307-9e4bc7d372bb-kube-api-access-zqzg9\") pod \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.022020 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-combined-ca-bundle\") pod \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\" (UID: \"d7846f23-5aa1-4613-a307-9e4bc7d372bb\") " Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.023435 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7846f23-5aa1-4613-a307-9e4bc7d372bb-logs" (OuterVolumeSpecName: "logs") pod "d7846f23-5aa1-4613-a307-9e4bc7d372bb" (UID: "d7846f23-5aa1-4613-a307-9e4bc7d372bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.030393 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7846f23-5aa1-4613-a307-9e4bc7d372bb-kube-api-access-zqzg9" (OuterVolumeSpecName: "kube-api-access-zqzg9") pod "d7846f23-5aa1-4613-a307-9e4bc7d372bb" (UID: "d7846f23-5aa1-4613-a307-9e4bc7d372bb"). InnerVolumeSpecName "kube-api-access-zqzg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.055328 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7846f23-5aa1-4613-a307-9e4bc7d372bb" (UID: "d7846f23-5aa1-4613-a307-9e4bc7d372bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.061706 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-config-data" (OuterVolumeSpecName: "config-data") pod "d7846f23-5aa1-4613-a307-9e4bc7d372bb" (UID: "d7846f23-5aa1-4613-a307-9e4bc7d372bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.085387 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d7846f23-5aa1-4613-a307-9e4bc7d372bb" (UID: "d7846f23-5aa1-4613-a307-9e4bc7d372bb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.096905 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d7846f23-5aa1-4613-a307-9e4bc7d372bb" (UID: "d7846f23-5aa1-4613-a307-9e4bc7d372bb"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.124390 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.124424 4690 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.124437 4690 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.124449 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7846f23-5aa1-4613-a307-9e4bc7d372bb-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.124458 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqzg9\" (UniqueName: \"kubernetes.io/projected/d7846f23-5aa1-4613-a307-9e4bc7d372bb-kube-api-access-zqzg9\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.124466 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7846f23-5aa1-4613-a307-9e4bc7d372bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.223817 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.231405 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.247967 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:58 crc kubenswrapper[4690]: E0320 17:55:58.248669 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1296ff75-6f88-4e2c-bf63-b46c3b090a6d" containerName="nova-manage" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.248760 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="1296ff75-6f88-4e2c-bf63-b46c3b090a6d" containerName="nova-manage" Mar 20 17:55:58 crc kubenswrapper[4690]: E0320 17:55:58.248851 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerName="nova-api-log" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.248918 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerName="nova-api-log" Mar 20 17:55:58 crc kubenswrapper[4690]: E0320 17:55:58.249002 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerName="nova-api-api" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.249066 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerName="nova-api-api" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.249422 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerName="nova-api-log" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 
17:55:58.249535 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" containerName="nova-api-api" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.249607 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="1296ff75-6f88-4e2c-bf63-b46c3b090a6d" containerName="nova-manage" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.250841 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.253492 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.253792 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.253898 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.262834 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.428641 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.428741 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-config-data\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.428795 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.428812 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-public-tls-certs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.428874 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbh8z\" (UniqueName: \"kubernetes.io/projected/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-kube-api-access-cbh8z\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.428892 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-logs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.530696 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.530809 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-config-data\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.530866 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.530882 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-public-tls-certs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.530948 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbh8z\" (UniqueName: \"kubernetes.io/projected/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-kube-api-access-cbh8z\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.530969 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-logs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.531421 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-logs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.536022 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.540665 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-public-tls-certs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.549447 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbh8z\" (UniqueName: \"kubernetes.io/projected/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-kube-api-access-cbh8z\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.551268 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.552090 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8feaad-3661-4ea6-9e2d-90cf79d48df9-config-data\") pod \"nova-api-0\" (UID: \"5f8feaad-3661-4ea6-9e2d-90cf79d48df9\") " pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.650544 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:55:58 crc kubenswrapper[4690]: I0320 17:55:58.914215 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5","Type":"ContainerStarted","Data":"1822ee26551f883548ae62a32df40582fe718f4374ffa05e8e9dff183c823bbc"} Mar 20 17:55:59 crc kubenswrapper[4690]: I0320 17:55:59.132665 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:55:59 crc kubenswrapper[4690]: W0320 17:55:59.139305 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f8feaad_3661_4ea6_9e2d_90cf79d48df9.slice/crio-aeb23df777fed4cf43602021b657b593e3700a626a51feeefee3e694c8cb4db6 WatchSource:0}: Error finding container aeb23df777fed4cf43602021b657b593e3700a626a51feeefee3e694c8cb4db6: Status 404 returned error can't find the container with id aeb23df777fed4cf43602021b657b593e3700a626a51feeefee3e694c8cb4db6 Mar 20 17:55:59 crc kubenswrapper[4690]: I0320 17:55:59.899766 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7846f23-5aa1-4613-a307-9e4bc7d372bb" path="/var/lib/kubelet/pods/d7846f23-5aa1-4613-a307-9e4bc7d372bb/volumes" Mar 20 17:55:59 crc kubenswrapper[4690]: I0320 17:55:59.933140 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f8feaad-3661-4ea6-9e2d-90cf79d48df9","Type":"ContainerStarted","Data":"5f1850b7f9680ca891259a40acdac2d51ec8fbd77f7437161163ff932ca3c7c3"} Mar 20 17:55:59 crc kubenswrapper[4690]: I0320 17:55:59.933184 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f8feaad-3661-4ea6-9e2d-90cf79d48df9","Type":"ContainerStarted","Data":"4a3ae04a1bdb25c3cd4760141e0148d5f10931d23c27712ab64072004caa7fa9"} Mar 20 17:55:59 crc kubenswrapper[4690]: I0320 17:55:59.933193 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f8feaad-3661-4ea6-9e2d-90cf79d48df9","Type":"ContainerStarted","Data":"aeb23df777fed4cf43602021b657b593e3700a626a51feeefee3e694c8cb4db6"} Mar 20 17:55:59 crc kubenswrapper[4690]: I0320 17:55:59.987166 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.987146497 podStartE2EDuration="1.987146497s" podCreationTimestamp="2026-03-20 17:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:55:59.968374756 +0000 UTC m=+1434.834200434" watchObservedRunningTime="2026-03-20 17:55:59.987146497 +0000 UTC m=+1434.852972175" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.137359 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567156-n6gh2"] Mar 20 17:56:00 crc 
kubenswrapper[4690]: I0320 17:56:00.138915 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-n6gh2" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.143017 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.143058 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.143328 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.146279 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-n6gh2"] Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.189637 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms2gd\" (UniqueName: \"kubernetes.io/projected/98fc80fa-7ce3-43dc-9ec0-cccc94302c99-kube-api-access-ms2gd\") pod \"auto-csr-approver-29567156-n6gh2\" (UID: \"98fc80fa-7ce3-43dc-9ec0-cccc94302c99\") " pod="openshift-infra/auto-csr-approver-29567156-n6gh2" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.290987 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms2gd\" (UniqueName: \"kubernetes.io/projected/98fc80fa-7ce3-43dc-9ec0-cccc94302c99-kube-api-access-ms2gd\") pod \"auto-csr-approver-29567156-n6gh2\" (UID: \"98fc80fa-7ce3-43dc-9ec0-cccc94302c99\") " pod="openshift-infra/auto-csr-approver-29567156-n6gh2" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.317386 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms2gd\" (UniqueName: \"kubernetes.io/projected/98fc80fa-7ce3-43dc-9ec0-cccc94302c99-kube-api-access-ms2gd\") pod \"auto-csr-approver-29567156-n6gh2\" (UID: \"98fc80fa-7ce3-43dc-9ec0-cccc94302c99\") " pod="openshift-infra/auto-csr-approver-29567156-n6gh2" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.461742 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-n6gh2" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.724166 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.801531 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-nova-metadata-tls-certs\") pod \"0953a097-ab34-4b4e-8389-00cc858d9a36\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.801662 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgfpd\" (UniqueName: \"kubernetes.io/projected/0953a097-ab34-4b4e-8389-00cc858d9a36-kube-api-access-hgfpd\") pod \"0953a097-ab34-4b4e-8389-00cc858d9a36\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.801736 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0953a097-ab34-4b4e-8389-00cc858d9a36-logs\") pod \"0953a097-ab34-4b4e-8389-00cc858d9a36\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.801825 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-combined-ca-bundle\") pod \"0953a097-ab34-4b4e-8389-00cc858d9a36\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.801848 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-config-data\") pod \"0953a097-ab34-4b4e-8389-00cc858d9a36\" (UID: \"0953a097-ab34-4b4e-8389-00cc858d9a36\") " Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.803846 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0953a097-ab34-4b4e-8389-00cc858d9a36-logs" (OuterVolumeSpecName: "logs") pod "0953a097-ab34-4b4e-8389-00cc858d9a36" (UID: "0953a097-ab34-4b4e-8389-00cc858d9a36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.808595 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0953a097-ab34-4b4e-8389-00cc858d9a36-kube-api-access-hgfpd" (OuterVolumeSpecName: "kube-api-access-hgfpd") pod "0953a097-ab34-4b4e-8389-00cc858d9a36" (UID: "0953a097-ab34-4b4e-8389-00cc858d9a36"). InnerVolumeSpecName "kube-api-access-hgfpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.849659 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0953a097-ab34-4b4e-8389-00cc858d9a36" (UID: "0953a097-ab34-4b4e-8389-00cc858d9a36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.851978 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-config-data" (OuterVolumeSpecName: "config-data") pod "0953a097-ab34-4b4e-8389-00cc858d9a36" (UID: "0953a097-ab34-4b4e-8389-00cc858d9a36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.884702 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0953a097-ab34-4b4e-8389-00cc858d9a36" (UID: "0953a097-ab34-4b4e-8389-00cc858d9a36"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.904267 4690 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0953a097-ab34-4b4e-8389-00cc858d9a36-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.904299 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.904308 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.904317 4690 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0953a097-ab34-4b4e-8389-00cc858d9a36-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.904327 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgfpd\" (UniqueName: \"kubernetes.io/projected/0953a097-ab34-4b4e-8389-00cc858d9a36-kube-api-access-hgfpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.946949 4690 generic.go:334] "Generic (PLEG): container finished" podID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerID="7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8" exitCode=0 Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.947030 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.947030 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0953a097-ab34-4b4e-8389-00cc858d9a36","Type":"ContainerDied","Data":"7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8"} Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.948653 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0953a097-ab34-4b4e-8389-00cc858d9a36","Type":"ContainerDied","Data":"98ac9663ea55f6e3f6cdacb736bc37d1d10de45099a043d4690dea03fb1747a4"} Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.948671 4690 scope.go:117] "RemoveContainer" containerID="7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.958788 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5","Type":"ContainerStarted","Data":"9f4695ee4ebbf2ccf34dc9d86af0dc6ea3b6da07c4d59e117570a011e67564d7"} Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.958882 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.967317 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-n6gh2"] Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.979380 4690 scope.go:117] "RemoveContainer" containerID="cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274" Mar 20 17:56:00 crc kubenswrapper[4690]: I0320 17:56:00.981814 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.902832053 podStartE2EDuration="6.981797416s" podCreationTimestamp="2026-03-20 17:55:54 +0000 UTC" firstStartedPulling="2026-03-20 17:55:55.7892439 +0000 UTC m=+1430.655069598" lastFinishedPulling="2026-03-20 17:55:59.868209283 +0000 UTC m=+1434.734034961" observedRunningTime="2026-03-20 17:56:00.981373764 +0000 UTC m=+1435.847199472" watchObservedRunningTime="2026-03-20 17:56:00.981797416 +0000 UTC m=+1435.847623094" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.008135 4690 scope.go:117] "RemoveContainer" containerID="7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8" Mar 20 17:56:01 crc kubenswrapper[4690]: E0320 17:56:01.009048 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8\": container with ID starting with 7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8 not found: ID does not exist" containerID="7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.009085 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8"} err="failed to get container status \"7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8\": rpc error: code = NotFound desc = could not find container \"7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8\": container with ID starting with 7301b1b2ac4f7fcb788150a6bbb67e141f3313b86114ced59fd41ffa3efa0bf8 not found: ID does not exist" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.009107 
4690 scope.go:117] "RemoveContainer" containerID="cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274" Mar 20 17:56:01 crc kubenswrapper[4690]: E0320 17:56:01.009444 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274\": container with ID starting with cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274 not found: ID does not exist" containerID="cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.009460 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274"} err="failed to get container status \"cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274\": rpc error: code = NotFound desc = could not find container \"cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274\": container with ID starting with cb1d2f86c59ecb4fa2c614c24bc6db28795c5867c5ee347e656ccf8c542a5274 not found: ID does not exist" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.017465 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.026157 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.035895 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:56:01 crc kubenswrapper[4690]: E0320 17:56:01.036404 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-metadata" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.036428 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-metadata" Mar 20 17:56:01 crc kubenswrapper[4690]: E0320 17:56:01.036459 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-log" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.036470 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-log" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.036690 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-log" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.036710 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" containerName="nova-metadata-metadata" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.037840 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.043894 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.045090 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.045422 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.107395 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj78p\" (UniqueName: \"kubernetes.io/projected/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-kube-api-access-jj78p\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.107447 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.107641 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-config-data\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.107683 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.107722 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-logs\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.209411 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj78p\" (UniqueName: \"kubernetes.io/projected/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-kube-api-access-jj78p\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.209460 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.209549 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-config-data\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " 
pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.209575 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.209608 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-logs\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.210223 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-logs\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.213979 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.214971 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.215767 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-config-data\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.227786 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj78p\" (UniqueName: \"kubernetes.io/projected/bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f-kube-api-access-jj78p\") pod \"nova-metadata-0\" (UID: \"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f\") " pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.359101 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.845152 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.902288 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0953a097-ab34-4b4e-8389-00cc858d9a36" path="/var/lib/kubelet/pods/0953a097-ab34-4b4e-8389-00cc858d9a36/volumes" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.944858 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.972200 4690 generic.go:334] "Generic (PLEG): container finished" podID="27f99cff-5842-4132-89a9-3cc1872139cf" containerID="c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7" exitCode=0 Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.972297 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"27f99cff-5842-4132-89a9-3cc1872139cf","Type":"ContainerDied","Data":"c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7"} Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.972325 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"27f99cff-5842-4132-89a9-3cc1872139cf","Type":"ContainerDied","Data":"0b8c4448455a714f08b538d5472432e8cc8f85ace5912d1b8835dae54c37ccc0"} Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.972344 4690 scope.go:117] "RemoveContainer" containerID="c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.972437 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.977056 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-n6gh2" event={"ID":"98fc80fa-7ce3-43dc-9ec0-cccc94302c99","Type":"ContainerStarted","Data":"81a1cb84987eade9e0ec7e7ec5c4c74460ab62341b55c6e6519fa8a2fe8c7dcc"} Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.980216 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f","Type":"ContainerStarted","Data":"4aa929e7aca8647b56d5e93f9991669e6b1b3ac03c6629a082a216c58b5945f4"} Mar 20 17:56:01 crc kubenswrapper[4690]: I0320 17:56:01.999463 4690 scope.go:117] "RemoveContainer" containerID="c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7" Mar 20 17:56:02 crc kubenswrapper[4690]: E0320 17:56:02.000081 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7\": container with ID starting with c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7 not found: ID does not exist" containerID="c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.000131 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7"} err="failed to get container status \"c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7\": rpc error: code = NotFound desc = could not find container \"c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7\": container with ID starting with c5d3e79567996ab752e5f084c5d4ff7071a56d2617d055dec8395e07f8b9c3e7 not found: ID does not exist" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.024843 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-combined-ca-bundle\") pod \"27f99cff-5842-4132-89a9-3cc1872139cf\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 
17:56:02.024948 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9lwb\" (UniqueName: \"kubernetes.io/projected/27f99cff-5842-4132-89a9-3cc1872139cf-kube-api-access-l9lwb\") pod \"27f99cff-5842-4132-89a9-3cc1872139cf\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.024974 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-config-data\") pod \"27f99cff-5842-4132-89a9-3cc1872139cf\" (UID: \"27f99cff-5842-4132-89a9-3cc1872139cf\") " Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.035514 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f99cff-5842-4132-89a9-3cc1872139cf-kube-api-access-l9lwb" (OuterVolumeSpecName: "kube-api-access-l9lwb") pod "27f99cff-5842-4132-89a9-3cc1872139cf" (UID: "27f99cff-5842-4132-89a9-3cc1872139cf"). InnerVolumeSpecName "kube-api-access-l9lwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.055564 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27f99cff-5842-4132-89a9-3cc1872139cf" (UID: "27f99cff-5842-4132-89a9-3cc1872139cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.067213 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-config-data" (OuterVolumeSpecName: "config-data") pod "27f99cff-5842-4132-89a9-3cc1872139cf" (UID: "27f99cff-5842-4132-89a9-3cc1872139cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.128176 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.128222 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9lwb\" (UniqueName: \"kubernetes.io/projected/27f99cff-5842-4132-89a9-3cc1872139cf-kube-api-access-l9lwb\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.128233 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f99cff-5842-4132-89a9-3cc1872139cf-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.335641 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.350590 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.360264 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:56:02 crc kubenswrapper[4690]: E0320 17:56:02.360766 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f99cff-5842-4132-89a9-3cc1872139cf" containerName="nova-scheduler-scheduler" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.360788 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f99cff-5842-4132-89a9-3cc1872139cf" containerName="nova-scheduler-scheduler" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.360980 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f99cff-5842-4132-89a9-3cc1872139cf" containerName="nova-scheduler-scheduler" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.361648 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.373808 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.389589 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.436441 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c3a0b8-8793-4e94-bbee-851b32f0a393-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.436537 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c3a0b8-8793-4e94-bbee-851b32f0a393-config-data\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.436622 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7pmw\" (UniqueName: \"kubernetes.io/projected/a6c3a0b8-8793-4e94-bbee-851b32f0a393-kube-api-access-p7pmw\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.537952 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7pmw\" (UniqueName: \"kubernetes.io/projected/a6c3a0b8-8793-4e94-bbee-851b32f0a393-kube-api-access-p7pmw\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.538077 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c3a0b8-8793-4e94-bbee-851b32f0a393-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.538120 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c3a0b8-8793-4e94-bbee-851b32f0a393-config-data\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.542714 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c3a0b8-8793-4e94-bbee-851b32f0a393-config-data\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.543973 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c3a0b8-8793-4e94-bbee-851b32f0a393-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.559246 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7pmw\" (UniqueName: 
\"kubernetes.io/projected/a6c3a0b8-8793-4e94-bbee-851b32f0a393-kube-api-access-p7pmw\") pod \"nova-scheduler-0\" (UID: \"a6c3a0b8-8793-4e94-bbee-851b32f0a393\") " pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.688084 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.989287 4690 generic.go:334] "Generic (PLEG): container finished" podID="98fc80fa-7ce3-43dc-9ec0-cccc94302c99" containerID="c8c1f89c24b34f4002fd1d1894f93e1415f49d52941e54264d9454bf070accb2" exitCode=0 Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.989342 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-n6gh2" event={"ID":"98fc80fa-7ce3-43dc-9ec0-cccc94302c99","Type":"ContainerDied","Data":"c8c1f89c24b34f4002fd1d1894f93e1415f49d52941e54264d9454bf070accb2"} Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.992097 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f","Type":"ContainerStarted","Data":"78247bace0dbf41f0f3135230f7787e06eb8b8b56248738bb7e08e40353cf313"} Mar 20 17:56:02 crc kubenswrapper[4690]: I0320 17:56:02.992127 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f","Type":"ContainerStarted","Data":"d4fe12c99e705ec5c3af6a3372b4815f7e6308348fd26a8750baac3ff60452d6"} Mar 20 17:56:03 crc kubenswrapper[4690]: I0320 17:56:03.155401 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.155378775 podStartE2EDuration="3.155378775s" podCreationTimestamp="2026-03-20 17:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:56:03.025525146 +0000 UTC m=+1437.891350824" watchObservedRunningTime="2026-03-20 17:56:03.155378775 +0000 UTC m=+1438.021204463" Mar 20 17:56:03 crc kubenswrapper[4690]: I0320 17:56:03.166020 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:56:03 crc kubenswrapper[4690]: I0320 17:56:03.897788 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f99cff-5842-4132-89a9-3cc1872139cf" path="/var/lib/kubelet/pods/27f99cff-5842-4132-89a9-3cc1872139cf/volumes" Mar 20 17:56:04 crc kubenswrapper[4690]: I0320 17:56:04.001317 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6c3a0b8-8793-4e94-bbee-851b32f0a393","Type":"ContainerStarted","Data":"e9f83ca269246d32f85b2451cc4fe87db976fdd532a6a543ba1ccf4256ed4c2d"} Mar 20 17:56:04 crc kubenswrapper[4690]: I0320 17:56:04.001354 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6c3a0b8-8793-4e94-bbee-851b32f0a393","Type":"ContainerStarted","Data":"7116b7a61f4fdd878eea5042469c25928917de0d4db0ba74381c1273cc5e0f7e"} Mar 20 17:56:04 crc kubenswrapper[4690]: I0320 17:56:04.045962 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.04594088 podStartE2EDuration="2.04594088s" podCreationTimestamp="2026-03-20 17:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
17:56:04.026926662 +0000 UTC m=+1438.892752340" watchObservedRunningTime="2026-03-20 17:56:04.04594088 +0000 UTC m=+1438.911766558" Mar 20 17:56:04 crc kubenswrapper[4690]: I0320 17:56:04.496397 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-n6gh2" Mar 20 17:56:04 crc kubenswrapper[4690]: I0320 17:56:04.573356 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms2gd\" (UniqueName: \"kubernetes.io/projected/98fc80fa-7ce3-43dc-9ec0-cccc94302c99-kube-api-access-ms2gd\") pod \"98fc80fa-7ce3-43dc-9ec0-cccc94302c99\" (UID: \"98fc80fa-7ce3-43dc-9ec0-cccc94302c99\") " Mar 20 17:56:04 crc kubenswrapper[4690]: I0320 17:56:04.590446 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98fc80fa-7ce3-43dc-9ec0-cccc94302c99-kube-api-access-ms2gd" (OuterVolumeSpecName: "kube-api-access-ms2gd") pod "98fc80fa-7ce3-43dc-9ec0-cccc94302c99" (UID: "98fc80fa-7ce3-43dc-9ec0-cccc94302c99"). InnerVolumeSpecName "kube-api-access-ms2gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:04 crc kubenswrapper[4690]: I0320 17:56:04.675572 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms2gd\" (UniqueName: \"kubernetes.io/projected/98fc80fa-7ce3-43dc-9ec0-cccc94302c99-kube-api-access-ms2gd\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:05 crc kubenswrapper[4690]: I0320 17:56:05.009380 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-n6gh2" event={"ID":"98fc80fa-7ce3-43dc-9ec0-cccc94302c99","Type":"ContainerDied","Data":"81a1cb84987eade9e0ec7e7ec5c4c74460ab62341b55c6e6519fa8a2fe8c7dcc"} Mar 20 17:56:05 crc kubenswrapper[4690]: I0320 17:56:05.009829 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81a1cb84987eade9e0ec7e7ec5c4c74460ab62341b55c6e6519fa8a2fe8c7dcc" Mar 20 17:56:05 crc kubenswrapper[4690]: I0320 17:56:05.009398 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-n6gh2" Mar 20 17:56:05 crc kubenswrapper[4690]: I0320 17:56:05.571563 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-4m2nn"] Mar 20 17:56:05 crc kubenswrapper[4690]: I0320 17:56:05.582463 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-4m2nn"] Mar 20 17:56:05 crc kubenswrapper[4690]: I0320 17:56:05.910174 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb74f825-5d0a-4a8a-8d15-d95cfdcf2729" path="/var/lib/kubelet/pods/cb74f825-5d0a-4a8a-8d15-d95cfdcf2729/volumes" Mar 20 17:56:07 crc kubenswrapper[4690]: I0320 17:56:07.688338 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:56:08 crc kubenswrapper[4690]: I0320 17:56:08.651077 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:56:08 crc kubenswrapper[4690]: I0320 17:56:08.651531 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:56:09 crc kubenswrapper[4690]: I0320 17:56:09.669419 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5f8feaad-3661-4ea6-9e2d-90cf79d48df9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:56:09 crc kubenswrapper[4690]: I0320 17:56:09.669711 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5f8feaad-3661-4ea6-9e2d-90cf79d48df9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:56:11 crc kubenswrapper[4690]: I0320 17:56:11.360214 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:56:11 crc kubenswrapper[4690]: I0320 17:56:11.360680 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:56:12 crc kubenswrapper[4690]: I0320 17:56:12.375434 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:56:12 crc kubenswrapper[4690]: I0320 17:56:12.375476 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:56:12 crc kubenswrapper[4690]: I0320 17:56:12.688315 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 17:56:12 crc kubenswrapper[4690]: I0320 17:56:12.746883 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 17:56:13 crc kubenswrapper[4690]: I0320 17:56:13.147514 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 17:56:16 crc kubenswrapper[4690]: I0320 
17:56:16.651025 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:56:16 crc kubenswrapper[4690]: I0320 17:56:16.651371 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:56:18 crc kubenswrapper[4690]: I0320 17:56:18.660839 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:56:18 crc kubenswrapper[4690]: I0320 17:56:18.663707 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:56:18 crc kubenswrapper[4690]: I0320 17:56:18.672517 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:56:19 crc kubenswrapper[4690]: I0320 17:56:19.174474 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:56:19 crc kubenswrapper[4690]: I0320 17:56:19.359868 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:56:19 crc kubenswrapper[4690]: I0320 17:56:19.360159 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:56:19 crc kubenswrapper[4690]: I0320 17:56:19.819311 4690 scope.go:117] "RemoveContainer" containerID="8ba358eb6c3b3ac3d431f63cad1f33f11e7d1366cd0c13b9100253d62a11fabc" Mar 20 17:56:21 crc kubenswrapper[4690]: I0320 17:56:21.367707 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:56:21 crc kubenswrapper[4690]: I0320 17:56:21.376555 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:56:21 crc kubenswrapper[4690]: I0320 17:56:21.377557 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:56:22 crc kubenswrapper[4690]: I0320 17:56:22.210369 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:56:25 crc kubenswrapper[4690]: I0320 17:56:25.334905 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 17:56:34 crc kubenswrapper[4690]: I0320 17:56:34.454553 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:56:35 crc kubenswrapper[4690]: I0320 17:56:35.222310 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:56:36 crc kubenswrapper[4690]: I0320 17:56:36.827688 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerName="rabbitmq" containerID="cri-o://014e5dcc51458e00ea1c1e92fc8066e86e8ba38713cdd0bb150493ff18fbc998" gracePeriod=58 Mar 20 17:56:37 crc kubenswrapper[4690]: I0320 17:56:37.761924 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerName="rabbitmq" containerID="cri-o://09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced" gracePeriod=58 Mar 20 17:56:38 crc kubenswrapper[4690]: I0320 17:56:38.518278 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerName="rabbitmq" probeResult="failure" output="dial tcp 
10.217.0.97:5671: connect: connection refused" Mar 20 17:56:38 crc kubenswrapper[4690]: I0320 17:56:38.693956 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.364025 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.377994 4690 generic.go:334] "Generic (PLEG): container finished" podID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerID="09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced" exitCode=0 Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.378038 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7","Type":"ContainerDied","Data":"09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced"} Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.378090 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7","Type":"ContainerDied","Data":"90c0dff250aab7f2bbe343a386791e62adb9fdbf2ae00b7c9e03c065674a4553"} Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.378110 4690 scope.go:117] "RemoveContainer" containerID="09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.378050 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.408238 4690 scope.go:117] "RemoveContainer" containerID="3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.444882 4690 scope.go:117] "RemoveContainer" containerID="09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced" Mar 20 17:56:39 crc kubenswrapper[4690]: E0320 17:56:39.445775 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced\": container with ID starting with 09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced not found: ID does not exist" containerID="09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.445835 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced"} err="failed to get container status \"09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced\": rpc error: code = NotFound desc = could not find container \"09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced\": container with ID starting with 09ddd27993db8baaf316b15c984459a48393208845452c3703761da831dfaced not found: ID does not exist" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.445872 4690 scope.go:117] "RemoveContainer" containerID="3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406" Mar 20 17:56:39 crc kubenswrapper[4690]: E0320 17:56:39.446568 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406\": container with ID starting with 3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406 not found: ID does not exist" containerID="3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.446622 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406"} err="failed to get container status \"3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406\": rpc error: code = NotFound desc = could not find container \"3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406\": container with ID starting with 3df7ac4d250a04a6d7d52ab030145e0cd9c9bdc339e5fa7bd91d25cb277c9406 not found: ID does not exist" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.472933 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-tls\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473001 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-869th\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-kube-api-access-869th\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473038 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-erlang-cookie\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473075 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-plugins-conf\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473123 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-erlang-cookie-secret\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473155 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-server-conf\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473197 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473215 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-plugins\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473237 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-confd\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473335 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-pod-info\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.473383 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-config-data\") pod \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\" (UID: \"a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7\") " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.479224 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.479453 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.479532 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.483928 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.484173 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.484765 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.485609 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-kube-api-access-869th" (OuterVolumeSpecName: "kube-api-access-869th") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "kube-api-access-869th". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.506385 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-pod-info" (OuterVolumeSpecName: "pod-info") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.513992 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-config-data" (OuterVolumeSpecName: "config-data") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.519077 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-server-conf" (OuterVolumeSpecName: "server-conf") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576229 4690 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576292 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576311 4690 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576329 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-869th\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-kube-api-access-869th\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576349 4690 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576366 4690 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576381 4690 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576396 4690 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576438 4690 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.576457 4690 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.582395 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" (UID: "a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.605750 4690 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.677117 4690 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.677150 4690 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.722650 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.732727 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.757333 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:56:39 crc kubenswrapper[4690]: E0320 17:56:39.757882 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fc80fa-7ce3-43dc-9ec0-cccc94302c99" containerName="oc" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.757914 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fc80fa-7ce3-43dc-9ec0-cccc94302c99" containerName="oc" Mar 20 17:56:39 crc kubenswrapper[4690]: E0320 17:56:39.757977 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerName="setup-container" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.757991 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerName="setup-container" Mar 20 17:56:39 crc kubenswrapper[4690]: E0320 17:56:39.758017 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerName="rabbitmq" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.758029 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerName="rabbitmq" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.758370 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" containerName="rabbitmq" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.758400 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fc80fa-7ce3-43dc-9ec0-cccc94302c99" containerName="oc" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.760018 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.764708 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.764758 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.764771 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.764775 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.764963 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kbltv" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.765018 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.765470 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.779270 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.779516 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.779610 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.779793 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.779900 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.780025 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6v92\" (UniqueName: 
\"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-kube-api-access-w6v92\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.780227 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.780390 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.780505 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.780616 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.780720 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.805521 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882623 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6v92\" (UniqueName: \"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-kube-api-access-w6v92\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882702 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882719 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882736 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882753 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882768 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882797 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882819 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882834 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882891 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.882910 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.883818 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.884151 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.884213 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.884224 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.884500 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.884604 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.887465 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.887583 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.888114 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.897581 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.903470 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6v92\" (UniqueName: \"kubernetes.io/projected/b93f0757-6c7a-473f-80e5-f4b9e7f88fad-kube-api-access-w6v92\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.904496 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7" path="/var/lib/kubelet/pods/a07bc2a2-e1d2-4185-87b5-53ebcce2cfd7/volumes" Mar 20 17:56:39 crc kubenswrapper[4690]: I0320 17:56:39.915945 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b93f0757-6c7a-473f-80e5-f4b9e7f88fad\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:40 crc kubenswrapper[4690]: I0320 17:56:40.090948 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:56:40 crc kubenswrapper[4690]: I0320 17:56:40.567057 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:56:41 crc kubenswrapper[4690]: I0320 17:56:41.400445 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b93f0757-6c7a-473f-80e5-f4b9e7f88fad","Type":"ContainerStarted","Data":"fe3cb89d21ef66f0cb089abe204a1d9f9f8763c83d1043789ace8ff317830d4b"} Mar 20 17:56:42 crc kubenswrapper[4690]: I0320 17:56:42.418578 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b93f0757-6c7a-473f-80e5-f4b9e7f88fad","Type":"ContainerStarted","Data":"c671d3d29278ae0c7a005e293815447fa52d515517090b0d6749d424b8221884"} Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.434786 4690 generic.go:334] "Generic (PLEG): container finished" podID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerID="014e5dcc51458e00ea1c1e92fc8066e86e8ba38713cdd0bb150493ff18fbc998" exitCode=0 Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.434920 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc","Type":"ContainerDied","Data":"014e5dcc51458e00ea1c1e92fc8066e86e8ba38713cdd0bb150493ff18fbc998"} Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.435112 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc","Type":"ContainerDied","Data":"67c11610348986459be7d3545e959b3f6c3cb99823efa90eeaf0b4cf35de901c"} Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.435126 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67c11610348986459be7d3545e959b3f6c3cb99823efa90eeaf0b4cf35de901c" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.458626 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.560909 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-config-data\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.560993 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-pod-info\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561042 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-tls\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561093 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-plugins\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561121 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-server-conf\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561201 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6qrh\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-kube-api-access-m6qrh\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561313 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-erlang-cookie\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561367 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-erlang-cookie-secret\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561403 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-plugins-conf\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561437 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: 
\"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561474 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-confd\") pod \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\" (UID: \"4ee4534b-8d84-4ca5-a8bc-10574d39d7bc\") " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561872 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.561953 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.562008 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.562394 4690 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.562696 4690 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.562710 4690 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.568560 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.575519 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-kube-api-access-m6qrh" (OuterVolumeSpecName: "kube-api-access-m6qrh") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "kube-api-access-m6qrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.579362 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.579487 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.611109 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-pod-info" (OuterVolumeSpecName: "pod-info") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.667699 4690 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.667745 4690 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.667760 4690 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.667774 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6qrh\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-kube-api-access-m6qrh\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.667788 4690 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.680968 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-server-conf" (OuterVolumeSpecName: "server-conf") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.709702 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-config-data" (OuterVolumeSpecName: "config-data") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.721456 4690 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.770413 4690 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.770460 4690 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.770473 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.803427 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" (UID: "4ee4534b-8d84-4ca5-a8bc-10574d39d7bc"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:43 crc kubenswrapper[4690]: I0320 17:56:43.872334 4690 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.442859 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.466149 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.475046 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.492294 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:56:44 crc kubenswrapper[4690]: E0320 17:56:44.492801 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerName="rabbitmq" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.492835 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerName="rabbitmq" Mar 20 17:56:44 crc kubenswrapper[4690]: E0320 17:56:44.492878 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerName="setup-container" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.492890 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerName="setup-container" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.493114 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" containerName="rabbitmq" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.495376 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.499237 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.499442 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.499633 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6x46v" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.499808 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.499912 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.500020 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.501478 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.512073 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.589099 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-njmnp"] Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.592116 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.602016 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.602371 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-njmnp"] Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686310 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686380 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686414 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab528fee-94bb-4907-aca5-97dcabef8332-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686490 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds7dw\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-kube-api-access-ds7dw\") pod \"rabbitmq-server-0\" 
(UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686570 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686624 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab528fee-94bb-4907-aca5-97dcabef8332-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686642 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686661 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686678 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686756 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686809 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-config\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686886 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.686974 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " 
pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.687020 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.687072 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjwq5\" (UniqueName: \"kubernetes.io/projected/c69a2330-7295-45a3-a6c0-edc86cfe42e7-kube-api-access-gjwq5\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.687154 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.687215 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.687280 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.788850 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789231 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789324 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789356 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab528fee-94bb-4907-aca5-97dcabef8332-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" 
Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789406 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds7dw\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-kube-api-access-ds7dw\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789470 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789494 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789515 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab528fee-94bb-4907-aca5-97dcabef8332-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789539 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789565 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789605 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-config\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789626 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789657 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789700 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789776 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789801 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjwq5\" (UniqueName: \"kubernetes.io/projected/c69a2330-7295-45a3-a6c0-edc86cfe42e7-kube-api-access-gjwq5\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789832 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789866 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.789953 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.790084 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.790869 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.791067 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.791124 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-config\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.791619 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.791978 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.792639 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.793692 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.793983 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.794128 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.797181 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab528fee-94bb-4907-aca5-97dcabef8332-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.799008 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.799484 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab528fee-94bb-4907-aca5-97dcabef8332-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.808335 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/ab528fee-94bb-4907-aca5-97dcabef8332-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.810502 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.812535 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds7dw\" (UniqueName: \"kubernetes.io/projected/ab528fee-94bb-4907-aca5-97dcabef8332-kube-api-access-ds7dw\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.814751 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjwq5\" (UniqueName: \"kubernetes.io/projected/c69a2330-7295-45a3-a6c0-edc86cfe42e7-kube-api-access-gjwq5\") pod \"dnsmasq-dns-79bd4cc8c9-njmnp\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.842143 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ab528fee-94bb-4907-aca5-97dcabef8332\") " pod="openstack/rabbitmq-server-0" Mar 20 17:56:44 crc kubenswrapper[4690]: I0320 17:56:44.913942 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:45 crc kubenswrapper[4690]: I0320 17:56:45.114877 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:56:45 crc kubenswrapper[4690]: I0320 17:56:45.419072 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-njmnp"] Mar 20 17:56:45 crc kubenswrapper[4690]: I0320 17:56:45.453004 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" event={"ID":"c69a2330-7295-45a3-a6c0-edc86cfe42e7","Type":"ContainerStarted","Data":"4f79f840ce6342756bcb0d731ed7973ce0e3f5d66a19d5fb1a9798cef3939ff8"} Mar 20 17:56:45 crc kubenswrapper[4690]: I0320 17:56:45.587629 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:56:45 crc kubenswrapper[4690]: I0320 17:56:45.906550 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee4534b-8d84-4ca5-a8bc-10574d39d7bc" path="/var/lib/kubelet/pods/4ee4534b-8d84-4ca5-a8bc-10574d39d7bc/volumes" Mar 20 17:56:46 crc kubenswrapper[4690]: I0320 17:56:46.466158 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab528fee-94bb-4907-aca5-97dcabef8332","Type":"ContainerStarted","Data":"4f462548fd7d84c5973b961c4ef25f8959e1762033b714c2e9da47ceee7593eb"} Mar 20 17:56:46 crc kubenswrapper[4690]: I0320 17:56:46.468857 4690 generic.go:334] "Generic (PLEG): container finished" podID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" containerID="d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4" exitCode=0 Mar 20 17:56:46 crc kubenswrapper[4690]: I0320 17:56:46.468887 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" event={"ID":"c69a2330-7295-45a3-a6c0-edc86cfe42e7","Type":"ContainerDied","Data":"d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4"} Mar 20 17:56:47 crc kubenswrapper[4690]: I0320 17:56:47.486209 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" event={"ID":"c69a2330-7295-45a3-a6c0-edc86cfe42e7","Type":"ContainerStarted","Data":"199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b"} Mar 20 17:56:47 crc kubenswrapper[4690]: I0320 17:56:47.486423 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:47 crc kubenswrapper[4690]: I0320 17:56:47.490828 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab528fee-94bb-4907-aca5-97dcabef8332","Type":"ContainerStarted","Data":"1b5f69ddeb012cffc65cf063976fa9fc89947c887fb70e237910cefca5aac610"} Mar 20 17:56:47 crc kubenswrapper[4690]: I0320 17:56:47.517875 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" podStartSLOduration=3.517851786 podStartE2EDuration="3.517851786s" podCreationTimestamp="2026-03-20 17:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:56:47.515360937 +0000 UTC m=+1482.381186625" watchObservedRunningTime="2026-03-20 17:56:47.517851786 +0000 UTC m=+1482.383677474" Mar 20 17:56:54 crc kubenswrapper[4690]: I0320 17:56:54.916468 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:56:54 crc kubenswrapper[4690]: I0320 17:56:54.995859 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7h684"] Mar 20 17:56:54 crc 
kubenswrapper[4690]: I0320 17:56:54.996380 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" podUID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" containerName="dnsmasq-dns" containerID="cri-o://22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6" gracePeriod=10 Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.188552 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-q7j6l"] Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.194705 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.237988 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-q7j6l"] Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.350524 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.350588 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm52l\" (UniqueName: \"kubernetes.io/projected/50265e08-57d1-4ae0-8434-086c38b3e525-kube-api-access-mm52l\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.350627 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-config\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.350716 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-dns-svc\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.350786 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.350813 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.351037 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.455417 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm52l\" (UniqueName: \"kubernetes.io/projected/50265e08-57d1-4ae0-8434-086c38b3e525-kube-api-access-mm52l\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.455487 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-config\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.455564 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-dns-svc\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.455630 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.455664 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.455738 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.455801 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.456337 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-config\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.456613 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.456634 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.456958 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-dns-svc\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.457304 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.458090 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50265e08-57d1-4ae0-8434-086c38b3e525-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.479649 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm52l\" (UniqueName: \"kubernetes.io/projected/50265e08-57d1-4ae0-8434-086c38b3e525-kube-api-access-mm52l\") pod \"dnsmasq-dns-55478c4467-q7j6l\" (UID: \"50265e08-57d1-4ae0-8434-086c38b3e525\") " pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.531671 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.546402 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.616485 4690 generic.go:334] "Generic (PLEG): container finished" podID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" containerID="22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6" exitCode=0 Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.616554 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.616546 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" event={"ID":"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77","Type":"ContainerDied","Data":"22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6"} Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.616730 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7h684" event={"ID":"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77","Type":"ContainerDied","Data":"601b8ba3957e1335746be44c44075457ab0c8ebbb31d05fa597a6f7a164d2b2c"} Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.616770 4690 scope.go:117] "RemoveContainer" containerID="22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.662668 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flcvv\" (UniqueName: \"kubernetes.io/projected/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-kube-api-access-flcvv\") pod \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.662752 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-sb\") pod \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.662770 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-swift-storage-0\") pod \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.662815 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-config\") pod \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.663475 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-nb\") pod \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.663897 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-svc\") pod \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\" (UID: \"1b5a6a56-2ecc-47cf-9f38-4fd2df362c77\") " Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.668949 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-kube-api-access-flcvv" (OuterVolumeSpecName: "kube-api-access-flcvv") pod "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" (UID: "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77"). InnerVolumeSpecName "kube-api-access-flcvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.696682 4690 scope.go:117] "RemoveContainer" containerID="20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.717662 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" (UID: "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.727348 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" (UID: "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.743696 4690 scope.go:117] "RemoveContainer" containerID="22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.744124 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" (UID: "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:55 crc kubenswrapper[4690]: E0320 17:56:55.746528 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6\": container with ID starting with 22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6 not found: ID does not exist" containerID="22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.746682 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6"} err="failed to get container status \"22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6\": rpc error: code = NotFound desc = could not find container \"22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6\": container with ID starting with 22556642c84455540ea294d9c9842722880e77895a45fbd6b0aaa97a6f82aec6 not found: ID does not exist" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.746838 4690 scope.go:117] "RemoveContainer" containerID="20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1" Mar 20 17:56:55 crc kubenswrapper[4690]: E0320 17:56:55.750302 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1\": container with ID starting with 20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1 not found: ID does not exist" containerID="20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.750369 4690 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1"} err="failed to get container status \"20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1\": rpc error: code = NotFound desc = could not find container \"20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1\": container with ID starting with 20769a7849e38f8dfc4f0c170b990eadc7c100f3b363f0a9b14cef58867a4dd1 not found: ID does not exist" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.751798 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" (UID: "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.752838 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-config" (OuterVolumeSpecName: "config") pod "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" (UID: "1b5a6a56-2ecc-47cf-9f38-4fd2df362c77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.766730 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.766758 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flcvv\" (UniqueName: \"kubernetes.io/projected/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-kube-api-access-flcvv\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.766771 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.766780 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.766791 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.766799 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.941848 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7h684"] Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.950656 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7h684"] Mar 20 17:56:55 crc kubenswrapper[4690]: I0320 17:56:55.995316 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-q7j6l"] Mar 20 17:56:56 crc kubenswrapper[4690]: I0320 17:56:56.627721 4690 generic.go:334] "Generic (PLEG): container finished" 
podID="50265e08-57d1-4ae0-8434-086c38b3e525" containerID="9f77004c536369e00ac8511b75f1a6a12704b1f0f0641f400eda98f51337c805" exitCode=0 Mar 20 17:56:56 crc kubenswrapper[4690]: I0320 17:56:56.627816 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-q7j6l" event={"ID":"50265e08-57d1-4ae0-8434-086c38b3e525","Type":"ContainerDied","Data":"9f77004c536369e00ac8511b75f1a6a12704b1f0f0641f400eda98f51337c805"} Mar 20 17:56:56 crc kubenswrapper[4690]: I0320 17:56:56.628018 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-q7j6l" event={"ID":"50265e08-57d1-4ae0-8434-086c38b3e525","Type":"ContainerStarted","Data":"ab5225803f52eb5a2cd9be1100e978d485c0a6806e00c3ec708ff71555ca14cb"} Mar 20 17:56:57 crc kubenswrapper[4690]: I0320 17:56:57.641120 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-q7j6l" event={"ID":"50265e08-57d1-4ae0-8434-086c38b3e525","Type":"ContainerStarted","Data":"e157e560356dcacf2c12e9c0fe4f7f2914761eec8ccd310d5a3c0c452f120887"} Mar 20 17:56:57 crc kubenswrapper[4690]: I0320 17:56:57.641426 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:56:57 crc kubenswrapper[4690]: I0320 17:56:57.674422 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-q7j6l" podStartSLOduration=2.674398638 podStartE2EDuration="2.674398638s" podCreationTimestamp="2026-03-20 17:56:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:56:57.66044065 +0000 UTC m=+1492.526266398" watchObservedRunningTime="2026-03-20 17:56:57.674398638 +0000 UTC m=+1492.540224326" Mar 20 17:56:57 crc kubenswrapper[4690]: I0320 17:56:57.894783 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" path="/var/lib/kubelet/pods/1b5a6a56-2ecc-47cf-9f38-4fd2df362c77/volumes" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.183878 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j7vkk"] Mar 20 17:57:02 crc kubenswrapper[4690]: E0320 17:57:02.184901 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" containerName="dnsmasq-dns" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.184918 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" containerName="dnsmasq-dns" Mar 20 17:57:02 crc kubenswrapper[4690]: E0320 17:57:02.184943 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" containerName="init" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.184952 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" containerName="init" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.185210 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5a6a56-2ecc-47cf-9f38-4fd2df362c77" containerName="dnsmasq-dns" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.186833 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.200006 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7vkk"] Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.225025 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqthk\" (UniqueName: \"kubernetes.io/projected/817a65d0-8808-4fbb-ae9b-10135392cb5e-kube-api-access-wqthk\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.225251 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-catalog-content\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.225339 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-utilities\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.327314 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-catalog-content\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.327817 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-catalog-content\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.328480 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-utilities\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.329016 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-utilities\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.329378 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqthk\" (UniqueName: \"kubernetes.io/projected/817a65d0-8808-4fbb-ae9b-10135392cb5e-kube-api-access-wqthk\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.365043 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wqthk\" (UniqueName: \"kubernetes.io/projected/817a65d0-8808-4fbb-ae9b-10135392cb5e-kube-api-access-wqthk\") pod \"redhat-operators-j7vkk\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:02 crc kubenswrapper[4690]: I0320 17:57:02.512902 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:03 crc kubenswrapper[4690]: I0320 17:57:03.002608 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7vkk"] Mar 20 17:57:03 crc kubenswrapper[4690]: W0320 17:57:03.007569 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817a65d0_8808_4fbb_ae9b_10135392cb5e.slice/crio-51dd0a4870a09182cc5a303252c8cc376def7f0500dede72555fb180ad749dd3 WatchSource:0}: Error finding container 51dd0a4870a09182cc5a303252c8cc376def7f0500dede72555fb180ad749dd3: Status 404 returned error can't find the container with id 51dd0a4870a09182cc5a303252c8cc376def7f0500dede72555fb180ad749dd3 Mar 20 17:57:03 crc kubenswrapper[4690]: I0320 17:57:03.706904 4690 generic.go:334] "Generic (PLEG): container finished" podID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerID="ff7146a960bce29c828c9fc42cf71a7f6107d956dc6425ed08b08f491b494449" exitCode=0 Mar 20 17:57:03 crc kubenswrapper[4690]: I0320 17:57:03.706955 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vkk" event={"ID":"817a65d0-8808-4fbb-ae9b-10135392cb5e","Type":"ContainerDied","Data":"ff7146a960bce29c828c9fc42cf71a7f6107d956dc6425ed08b08f491b494449"} Mar 20 17:57:03 crc kubenswrapper[4690]: I0320 17:57:03.708004 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vkk" event={"ID":"817a65d0-8808-4fbb-ae9b-10135392cb5e","Type":"ContainerStarted","Data":"51dd0a4870a09182cc5a303252c8cc376def7f0500dede72555fb180ad749dd3"} Mar 20 17:57:03 crc kubenswrapper[4690]: I0320 17:57:03.709334 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:57:05 crc kubenswrapper[4690]: I0320 17:57:05.533435 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-q7j6l" Mar 20 17:57:05 crc kubenswrapper[4690]: I0320 17:57:05.611788 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-njmnp"] Mar 20 17:57:05 crc kubenswrapper[4690]: I0320 17:57:05.612138 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" podUID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" containerName="dnsmasq-dns" containerID="cri-o://199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b" gracePeriod=10 Mar 20 17:57:05 crc kubenswrapper[4690]: I0320 17:57:05.741532 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vkk" event={"ID":"817a65d0-8808-4fbb-ae9b-10135392cb5e","Type":"ContainerStarted","Data":"6e6f57c30890ca2cd2dcd577297609447d104b2c49f5796667e78e3637004f21"} Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.672349 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.753187 4690 generic.go:334] "Generic (PLEG): container finished" podID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerID="6e6f57c30890ca2cd2dcd577297609447d104b2c49f5796667e78e3637004f21" exitCode=0 Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.753282 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vkk" event={"ID":"817a65d0-8808-4fbb-ae9b-10135392cb5e","Type":"ContainerDied","Data":"6e6f57c30890ca2cd2dcd577297609447d104b2c49f5796667e78e3637004f21"} Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.755136 4690 generic.go:334] "Generic (PLEG): container finished" podID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" containerID="199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b" exitCode=0 Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.755165 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" event={"ID":"c69a2330-7295-45a3-a6c0-edc86cfe42e7","Type":"ContainerDied","Data":"199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b"} Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.755183 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" event={"ID":"c69a2330-7295-45a3-a6c0-edc86cfe42e7","Type":"ContainerDied","Data":"4f79f840ce6342756bcb0d731ed7973ce0e3f5d66a19d5fb1a9798cef3939ff8"} Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.755200 4690 scope.go:117] "RemoveContainer" containerID="199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.755229 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-njmnp" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.763482 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-config\") pod \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.763541 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjwq5\" (UniqueName: \"kubernetes.io/projected/c69a2330-7295-45a3-a6c0-edc86cfe42e7-kube-api-access-gjwq5\") pod \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.763568 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-openstack-edpm-ipam\") pod \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.763621 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-nb\") pod \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.763791 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-swift-storage-0\") pod \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.763912 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-svc\") pod \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.763952 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-sb\") pod \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\" (UID: \"c69a2330-7295-45a3-a6c0-edc86cfe42e7\") " Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.771645 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69a2330-7295-45a3-a6c0-edc86cfe42e7-kube-api-access-gjwq5" (OuterVolumeSpecName: "kube-api-access-gjwq5") pod "c69a2330-7295-45a3-a6c0-edc86cfe42e7" (UID: "c69a2330-7295-45a3-a6c0-edc86cfe42e7"). InnerVolumeSpecName "kube-api-access-gjwq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.777933 4690 scope.go:117] "RemoveContainer" containerID="d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.819410 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-config" (OuterVolumeSpecName: "config") pod "c69a2330-7295-45a3-a6c0-edc86cfe42e7" (UID: "c69a2330-7295-45a3-a6c0-edc86cfe42e7"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.822950 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c69a2330-7295-45a3-a6c0-edc86cfe42e7" (UID: "c69a2330-7295-45a3-a6c0-edc86cfe42e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.823800 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c69a2330-7295-45a3-a6c0-edc86cfe42e7" (UID: "c69a2330-7295-45a3-a6c0-edc86cfe42e7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.829099 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c69a2330-7295-45a3-a6c0-edc86cfe42e7" (UID: "c69a2330-7295-45a3-a6c0-edc86cfe42e7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.842843 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c69a2330-7295-45a3-a6c0-edc86cfe42e7" (UID: "c69a2330-7295-45a3-a6c0-edc86cfe42e7"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.846835 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c69a2330-7295-45a3-a6c0-edc86cfe42e7" (UID: "c69a2330-7295-45a3-a6c0-edc86cfe42e7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.866429 4690 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.866484 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.866502 4690 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.866524 4690 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.866540 4690 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.866555 4690 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c69a2330-7295-45a3-a6c0-edc86cfe42e7-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.866569 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjwq5\" (UniqueName: \"kubernetes.io/projected/c69a2330-7295-45a3-a6c0-edc86cfe42e7-kube-api-access-gjwq5\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.922157 4690 scope.go:117] "RemoveContainer" containerID="199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b" Mar 20 17:57:06 crc kubenswrapper[4690]: E0320 17:57:06.922731 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b\": container with ID starting with 199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b not found: ID does not exist" containerID="199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.922776 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b"} err="failed to get container status \"199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b\": rpc error: code = NotFound desc = could not find container \"199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b\": container with ID starting with 199a6cf779f74b869d7b93acf9d25e89ca0d9c9203cef1a701d6b142c267350b not found: ID does not exist" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.922800 4690 scope.go:117] "RemoveContainer" containerID="d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4" Mar 20 17:57:06 crc kubenswrapper[4690]: E0320 17:57:06.923192 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4\": container with ID starting with d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4 not found: ID does not exist" containerID="d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4" Mar 20 17:57:06 crc kubenswrapper[4690]: I0320 17:57:06.923231 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4"} err="failed to get container status \"d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4\": rpc error: code = NotFound desc = could not find container \"d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4\": container with ID starting with d525df854186d4d0ba0835ec720b992845d1fdaa25f6b16722d593cf8f170de4 not found: ID does not exist" Mar 20 17:57:07 crc kubenswrapper[4690]: I0320 17:57:07.094182 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-njmnp"] Mar 20 17:57:07 crc kubenswrapper[4690]: I0320 17:57:07.104741 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-njmnp"] Mar 20 17:57:07 crc kubenswrapper[4690]: I0320 17:57:07.768627 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vkk" event={"ID":"817a65d0-8808-4fbb-ae9b-10135392cb5e","Type":"ContainerStarted","Data":"207cb527aeb1506423fc051d1af0c3ac1981bbd4b392d4c17c81548513cddce5"} Mar 20 17:57:07 crc kubenswrapper[4690]: I0320 17:57:07.790333 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j7vkk" podStartSLOduration=2.327506081 podStartE2EDuration="5.790313293s" podCreationTimestamp="2026-03-20 17:57:02 +0000 UTC" firstStartedPulling="2026-03-20 17:57:03.709069096 +0000 UTC m=+1498.574894774" lastFinishedPulling="2026-03-20 17:57:07.171876308 +0000 UTC m=+1502.037701986" observedRunningTime="2026-03-20 17:57:07.78518547 +0000 UTC m=+1502.651011168" watchObservedRunningTime="2026-03-20 17:57:07.790313293 +0000 UTC m=+1502.656138991" Mar 20 17:57:07 crc kubenswrapper[4690]: I0320 17:57:07.895550 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" path="/var/lib/kubelet/pods/c69a2330-7295-45a3-a6c0-edc86cfe42e7/volumes" Mar 20 17:57:12 crc kubenswrapper[4690]: I0320 17:57:12.514115 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:12 crc kubenswrapper[4690]: I0320 17:57:12.514613 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:13 crc kubenswrapper[4690]: I0320 17:57:13.575707 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j7vkk" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerName="registry-server" probeResult="failure" output=< Mar 20 17:57:13 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 17:57:13 crc kubenswrapper[4690]: > Mar 20 17:57:14 crc kubenswrapper[4690]: I0320 17:57:14.850242 4690 generic.go:334] "Generic (PLEG): container finished" podID="b93f0757-6c7a-473f-80e5-f4b9e7f88fad" containerID="c671d3d29278ae0c7a005e293815447fa52d515517090b0d6749d424b8221884" exitCode=0 Mar 20 17:57:14 crc kubenswrapper[4690]: I0320 17:57:14.850325 4690 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b93f0757-6c7a-473f-80e5-f4b9e7f88fad","Type":"ContainerDied","Data":"c671d3d29278ae0c7a005e293815447fa52d515517090b0d6749d424b8221884"} Mar 20 17:57:15 crc kubenswrapper[4690]: I0320 17:57:15.865049 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b93f0757-6c7a-473f-80e5-f4b9e7f88fad","Type":"ContainerStarted","Data":"0bb241ce9c07614e7ec0f8c129ddcb256dc223b4cafee71b784124cfea4c8e71"} Mar 20 17:57:15 crc kubenswrapper[4690]: I0320 17:57:15.865561 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:57:15 crc kubenswrapper[4690]: I0320 17:57:15.909572 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.909550753 podStartE2EDuration="36.909550753s" podCreationTimestamp="2026-03-20 17:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:57:15.895141453 +0000 UTC m=+1510.760967131" watchObservedRunningTime="2026-03-20 17:57:15.909550753 +0000 UTC m=+1510.775376441" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.540939 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6"] Mar 20 17:57:18 crc kubenswrapper[4690]: E0320 17:57:18.541919 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" containerName="dnsmasq-dns" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.541934 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" containerName="dnsmasq-dns" Mar 20 17:57:18 crc kubenswrapper[4690]: E0320 17:57:18.541949 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" containerName="init" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.541956 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" containerName="init" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.542199 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69a2330-7295-45a3-a6c0-edc86cfe42e7" containerName="dnsmasq-dns" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.542988 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.544995 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.545819 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.546358 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.546702 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.560813 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6"] Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.638877 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.639121 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft47p\" (UniqueName: \"kubernetes.io/projected/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-kube-api-access-ft47p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.639387 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.639430 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.741326 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.741377 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.741431 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.741532 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft47p\" (UniqueName: \"kubernetes.io/projected/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-kube-api-access-ft47p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.747684 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.747994 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.748434 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.772029 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft47p\" (UniqueName: \"kubernetes.io/projected/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-kube-api-access-ft47p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:18 crc kubenswrapper[4690]: I0320 17:57:18.863737 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:19 crc kubenswrapper[4690]: W0320 17:57:19.444589 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05a81786_36ff_4e8b_9bba_5e0ebbfc3247.slice/crio-754c126dd8bab1b23747dc0488fb30013072c6044e00de44698c328a59827839 WatchSource:0}: Error finding container 754c126dd8bab1b23747dc0488fb30013072c6044e00de44698c328a59827839: Status 404 returned error can't find the container with id 754c126dd8bab1b23747dc0488fb30013072c6044e00de44698c328a59827839 Mar 20 17:57:19 crc kubenswrapper[4690]: I0320 17:57:19.452541 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6"] Mar 20 17:57:19 crc kubenswrapper[4690]: I0320 17:57:19.912181 4690 generic.go:334] "Generic (PLEG): container finished" podID="ab528fee-94bb-4907-aca5-97dcabef8332" containerID="1b5f69ddeb012cffc65cf063976fa9fc89947c887fb70e237910cefca5aac610" exitCode=0 Mar 20 17:57:19 crc kubenswrapper[4690]: I0320 17:57:19.912301 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab528fee-94bb-4907-aca5-97dcabef8332","Type":"ContainerDied","Data":"1b5f69ddeb012cffc65cf063976fa9fc89947c887fb70e237910cefca5aac610"} Mar 20 17:57:19 crc kubenswrapper[4690]: I0320 17:57:19.914132 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" event={"ID":"05a81786-36ff-4e8b-9bba-5e0ebbfc3247","Type":"ContainerStarted","Data":"754c126dd8bab1b23747dc0488fb30013072c6044e00de44698c328a59827839"} Mar 20 17:57:20 crc kubenswrapper[4690]: I0320 17:57:20.928647 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab528fee-94bb-4907-aca5-97dcabef8332","Type":"ContainerStarted","Data":"6343a9e7fe29b944f03aad71cf38a1d579fc3f7b02f204d06750d1b15f34124e"} Mar 20 17:57:20 crc kubenswrapper[4690]: I0320 17:57:20.930193 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 17:57:20 crc kubenswrapper[4690]: I0320 17:57:20.951807 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.951785554 podStartE2EDuration="36.951785554s" podCreationTimestamp="2026-03-20 17:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:57:20.949730747 +0000 UTC m=+1515.815556425" watchObservedRunningTime="2026-03-20 17:57:20.951785554 +0000 UTC m=+1515.817611232" Mar 20 17:57:22 crc kubenswrapper[4690]: I0320 17:57:22.568759 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:22 crc kubenswrapper[4690]: I0320 17:57:22.623011 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:22 crc kubenswrapper[4690]: I0320 17:57:22.807629 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7vkk"] Mar 20 17:57:23 crc kubenswrapper[4690]: I0320 17:57:23.967203 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j7vkk" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" 
containerName="registry-server" containerID="cri-o://207cb527aeb1506423fc051d1af0c3ac1981bbd4b392d4c17c81548513cddce5" gracePeriod=2 Mar 20 17:57:24 crc kubenswrapper[4690]: I0320 17:57:24.977249 4690 generic.go:334] "Generic (PLEG): container finished" podID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerID="207cb527aeb1506423fc051d1af0c3ac1981bbd4b392d4c17c81548513cddce5" exitCode=0 Mar 20 17:57:24 crc kubenswrapper[4690]: I0320 17:57:24.977512 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vkk" event={"ID":"817a65d0-8808-4fbb-ae9b-10135392cb5e","Type":"ContainerDied","Data":"207cb527aeb1506423fc051d1af0c3ac1981bbd4b392d4c17c81548513cddce5"} Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.146514 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.243188 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-catalog-content\") pod \"817a65d0-8808-4fbb-ae9b-10135392cb5e\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.243373 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-utilities\") pod \"817a65d0-8808-4fbb-ae9b-10135392cb5e\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.243649 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqthk\" (UniqueName: \"kubernetes.io/projected/817a65d0-8808-4fbb-ae9b-10135392cb5e-kube-api-access-wqthk\") pod \"817a65d0-8808-4fbb-ae9b-10135392cb5e\" (UID: \"817a65d0-8808-4fbb-ae9b-10135392cb5e\") " Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.244199 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-utilities" (OuterVolumeSpecName: "utilities") pod "817a65d0-8808-4fbb-ae9b-10135392cb5e" (UID: "817a65d0-8808-4fbb-ae9b-10135392cb5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.244674 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.249540 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/817a65d0-8808-4fbb-ae9b-10135392cb5e-kube-api-access-wqthk" (OuterVolumeSpecName: "kube-api-access-wqthk") pod "817a65d0-8808-4fbb-ae9b-10135392cb5e" (UID: "817a65d0-8808-4fbb-ae9b-10135392cb5e"). InnerVolumeSpecName "kube-api-access-wqthk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.328782 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "817a65d0-8808-4fbb-ae9b-10135392cb5e" (UID: "817a65d0-8808-4fbb-ae9b-10135392cb5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.346737 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/817a65d0-8808-4fbb-ae9b-10135392cb5e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:29 crc kubenswrapper[4690]: I0320 17:57:29.346782 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqthk\" (UniqueName: \"kubernetes.io/projected/817a65d0-8808-4fbb-ae9b-10135392cb5e-kube-api-access-wqthk\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.029478 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vkk" event={"ID":"817a65d0-8808-4fbb-ae9b-10135392cb5e","Type":"ContainerDied","Data":"51dd0a4870a09182cc5a303252c8cc376def7f0500dede72555fb180ad749dd3"} Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.029503 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7vkk" Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.030225 4690 scope.go:117] "RemoveContainer" containerID="207cb527aeb1506423fc051d1af0c3ac1981bbd4b392d4c17c81548513cddce5" Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.031967 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" event={"ID":"05a81786-36ff-4e8b-9bba-5e0ebbfc3247","Type":"ContainerStarted","Data":"3387a32a1effff65b2ad55c87e960fa07164cc481f0562e9b7241ddb37cdcdb0"} Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.063922 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" podStartSLOduration=2.606893539 podStartE2EDuration="12.063898874s" podCreationTimestamp="2026-03-20 17:57:18 +0000 UTC" firstStartedPulling="2026-03-20 17:57:19.446988959 +0000 UTC m=+1514.312814637" lastFinishedPulling="2026-03-20 17:57:28.903994254 +0000 UTC m=+1523.769819972" observedRunningTime="2026-03-20 17:57:30.059110761 +0000 UTC m=+1524.924936449" watchObservedRunningTime="2026-03-20 17:57:30.063898874 +0000 UTC m=+1524.929724562" Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.064592 4690 scope.go:117] "RemoveContainer" containerID="6e6f57c30890ca2cd2dcd577297609447d104b2c49f5796667e78e3637004f21" Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.089505 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7vkk"] Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.096477 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.100817 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j7vkk"] Mar 20 17:57:30 crc kubenswrapper[4690]: I0320 17:57:30.136849 4690 scope.go:117] "RemoveContainer" containerID="ff7146a960bce29c828c9fc42cf71a7f6107d956dc6425ed08b08f491b494449" Mar 20 17:57:31 crc kubenswrapper[4690]: I0320 17:57:31.900916 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" path="/var/lib/kubelet/pods/817a65d0-8808-4fbb-ae9b-10135392cb5e/volumes" Mar 20 17:57:35 crc kubenswrapper[4690]: I0320 17:57:35.121569 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Mar 20 17:57:41 crc kubenswrapper[4690]: I0320 17:57:41.172858 4690 generic.go:334] "Generic (PLEG): container finished" podID="05a81786-36ff-4e8b-9bba-5e0ebbfc3247" containerID="3387a32a1effff65b2ad55c87e960fa07164cc481f0562e9b7241ddb37cdcdb0" exitCode=0 Mar 20 17:57:41 crc kubenswrapper[4690]: I0320 17:57:41.172974 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" event={"ID":"05a81786-36ff-4e8b-9bba-5e0ebbfc3247","Type":"ContainerDied","Data":"3387a32a1effff65b2ad55c87e960fa07164cc481f0562e9b7241ddb37cdcdb0"} Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.713893 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.851753 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-ssh-key-openstack-edpm-ipam\") pod \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.851878 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-inventory\") pod \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.851948 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-repo-setup-combined-ca-bundle\") pod \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.852105 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft47p\" (UniqueName: \"kubernetes.io/projected/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-kube-api-access-ft47p\") pod \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\" (UID: \"05a81786-36ff-4e8b-9bba-5e0ebbfc3247\") " Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.858012 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-kube-api-access-ft47p" (OuterVolumeSpecName: "kube-api-access-ft47p") pod "05a81786-36ff-4e8b-9bba-5e0ebbfc3247" (UID: "05a81786-36ff-4e8b-9bba-5e0ebbfc3247"). InnerVolumeSpecName "kube-api-access-ft47p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.858015 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "05a81786-36ff-4e8b-9bba-5e0ebbfc3247" (UID: "05a81786-36ff-4e8b-9bba-5e0ebbfc3247"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.887758 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05a81786-36ff-4e8b-9bba-5e0ebbfc3247" (UID: "05a81786-36ff-4e8b-9bba-5e0ebbfc3247"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.890922 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-inventory" (OuterVolumeSpecName: "inventory") pod "05a81786-36ff-4e8b-9bba-5e0ebbfc3247" (UID: "05a81786-36ff-4e8b-9bba-5e0ebbfc3247"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.954642 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.954699 4690 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.954720 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft47p\" (UniqueName: \"kubernetes.io/projected/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-kube-api-access-ft47p\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:42 crc kubenswrapper[4690]: I0320 17:57:42.954740 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05a81786-36ff-4e8b-9bba-5e0ebbfc3247-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.192641 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" event={"ID":"05a81786-36ff-4e8b-9bba-5e0ebbfc3247","Type":"ContainerDied","Data":"754c126dd8bab1b23747dc0488fb30013072c6044e00de44698c328a59827839"} Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.192927 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="754c126dd8bab1b23747dc0488fb30013072c6044e00de44698c328a59827839" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.192682 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.364266 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849"] Mar 20 17:57:43 crc kubenswrapper[4690]: E0320 17:57:43.365210 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerName="registry-server" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.365235 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerName="registry-server" Mar 20 17:57:43 crc kubenswrapper[4690]: E0320 17:57:43.365255 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerName="extract-utilities" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.365264 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerName="extract-utilities" Mar 20 17:57:43 crc kubenswrapper[4690]: E0320 17:57:43.365313 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerName="extract-content" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.365325 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerName="extract-content" Mar 20 17:57:43 crc kubenswrapper[4690]: E0320 17:57:43.365357 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a81786-36ff-4e8b-9bba-5e0ebbfc3247" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.365371 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a81786-36ff-4e8b-9bba-5e0ebbfc3247" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.365710 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="817a65d0-8808-4fbb-ae9b-10135392cb5e" containerName="registry-server" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.365736 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a81786-36ff-4e8b-9bba-5e0ebbfc3247" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.366431 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.368551 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.373775 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.377367 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.377675 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.380456 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849"] Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.463549 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.463631 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.463834 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6xzx\" (UniqueName: \"kubernetes.io/projected/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-kube-api-access-f6xzx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.566157 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.566556 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.566810 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6xzx\" (UniqueName: \"kubernetes.io/projected/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-kube-api-access-f6xzx\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.570203 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.571115 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.583245 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6xzx\" (UniqueName: \"kubernetes.io/projected/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-kube-api-access-f6xzx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-h7849\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:43 crc kubenswrapper[4690]: I0320 17:57:43.689473 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:44 crc kubenswrapper[4690]: I0320 17:57:44.243773 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849"] Mar 20 17:57:44 crc kubenswrapper[4690]: W0320 17:57:44.244427 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fdbed5c_e2a7_42ee_9e92_68d0bbbff023.slice/crio-40a840c63a52311b0c01975d9f5da434f6e430cc412fd2c95e8eb34781e63511 WatchSource:0}: Error finding container 40a840c63a52311b0c01975d9f5da434f6e430cc412fd2c95e8eb34781e63511: Status 404 returned error can't find the container with id 40a840c63a52311b0c01975d9f5da434f6e430cc412fd2c95e8eb34781e63511 Mar 20 17:57:45 crc kubenswrapper[4690]: I0320 17:57:45.219891 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" event={"ID":"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023","Type":"ContainerStarted","Data":"ad86406b37af514f92dc97bfa792ebaf239c7fdfab98292db8e246c91db9a00a"} Mar 20 17:57:45 crc kubenswrapper[4690]: I0320 17:57:45.220674 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" event={"ID":"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023","Type":"ContainerStarted","Data":"40a840c63a52311b0c01975d9f5da434f6e430cc412fd2c95e8eb34781e63511"} Mar 20 17:57:48 crc kubenswrapper[4690]: I0320 17:57:48.258814 4690 generic.go:334] "Generic (PLEG): container finished" podID="0fdbed5c-e2a7-42ee-9e92-68d0bbbff023" containerID="ad86406b37af514f92dc97bfa792ebaf239c7fdfab98292db8e246c91db9a00a" exitCode=0 Mar 20 17:57:48 crc kubenswrapper[4690]: I0320 17:57:48.258940 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" 
event={"ID":"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023","Type":"ContainerDied","Data":"ad86406b37af514f92dc97bfa792ebaf239c7fdfab98292db8e246c91db9a00a"} Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.700593 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.825786 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-ssh-key-openstack-edpm-ipam\") pod \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.825851 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-inventory\") pod \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.825924 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6xzx\" (UniqueName: \"kubernetes.io/projected/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-kube-api-access-f6xzx\") pod \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\" (UID: \"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023\") " Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.852783 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-kube-api-access-f6xzx" (OuterVolumeSpecName: "kube-api-access-f6xzx") pod "0fdbed5c-e2a7-42ee-9e92-68d0bbbff023" (UID: "0fdbed5c-e2a7-42ee-9e92-68d0bbbff023"). InnerVolumeSpecName "kube-api-access-f6xzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.862092 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0fdbed5c-e2a7-42ee-9e92-68d0bbbff023" (UID: "0fdbed5c-e2a7-42ee-9e92-68d0bbbff023"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.886788 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-inventory" (OuterVolumeSpecName: "inventory") pod "0fdbed5c-e2a7-42ee-9e92-68d0bbbff023" (UID: "0fdbed5c-e2a7-42ee-9e92-68d0bbbff023"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.928670 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6xzx\" (UniqueName: \"kubernetes.io/projected/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-kube-api-access-f6xzx\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.928705 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:49 crc kubenswrapper[4690]: I0320 17:57:49.928720 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fdbed5c-e2a7-42ee-9e92-68d0bbbff023-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.283396 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.283438 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-h7849" event={"ID":"0fdbed5c-e2a7-42ee-9e92-68d0bbbff023","Type":"ContainerDied","Data":"40a840c63a52311b0c01975d9f5da434f6e430cc412fd2c95e8eb34781e63511"} Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.283493 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40a840c63a52311b0c01975d9f5da434f6e430cc412fd2c95e8eb34781e63511" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.355566 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf"] Mar 20 17:57:50 crc kubenswrapper[4690]: E0320 17:57:50.358300 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fdbed5c-e2a7-42ee-9e92-68d0bbbff023" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.358335 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fdbed5c-e2a7-42ee-9e92-68d0bbbff023" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.358547 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fdbed5c-e2a7-42ee-9e92-68d0bbbff023" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.359991 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.364614 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.364803 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.365442 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.365465 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.391653 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf"] Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.451928 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9nsr\" (UniqueName: \"kubernetes.io/projected/33405126-fa78-4ad4-9587-e157ffd9f389-kube-api-access-c9nsr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.451990 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.452043 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.452078 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.554122 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9nsr\" (UniqueName: \"kubernetes.io/projected/33405126-fa78-4ad4-9587-e157ffd9f389-kube-api-access-c9nsr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.554303 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.554490 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.555561 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.559519 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.559671 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.562809 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.571467 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9nsr\" (UniqueName: \"kubernetes.io/projected/33405126-fa78-4ad4-9587-e157ffd9f389-kube-api-access-c9nsr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:50 crc kubenswrapper[4690]: I0320 17:57:50.682864 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 17:57:51 crc kubenswrapper[4690]: I0320 17:57:51.268400 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf"] Mar 20 17:57:51 crc kubenswrapper[4690]: I0320 17:57:51.294479 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" event={"ID":"33405126-fa78-4ad4-9587-e157ffd9f389","Type":"ContainerStarted","Data":"84e6318b40412bcde2cfcdc1619c2fb992a79e983ffda7619d62b952c1965e05"} Mar 20 17:57:52 crc kubenswrapper[4690]: I0320 17:57:52.306563 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" event={"ID":"33405126-fa78-4ad4-9587-e157ffd9f389","Type":"ContainerStarted","Data":"ab94ef72179a2a482a89ed746f85123c3a4cbc06440b8bbed7fd308ee24c56b8"} Mar 20 17:57:52 crc kubenswrapper[4690]: I0320 17:57:52.327192 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" podStartSLOduration=1.831759775 podStartE2EDuration="2.327174942s" podCreationTimestamp="2026-03-20 17:57:50 +0000 UTC" firstStartedPulling="2026-03-20 17:57:51.269205483 +0000 UTC m=+1546.135031171" lastFinishedPulling="2026-03-20 17:57:51.76462066 +0000 UTC m=+1546.630446338" observedRunningTime="2026-03-20 17:57:52.32673887 +0000 UTC m=+1547.192564558" watchObservedRunningTime="2026-03-20 17:57:52.327174942 +0000 UTC m=+1547.193000620" Mar 20 17:57:54 crc kubenswrapper[4690]: I0320 17:57:54.274592 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:57:54 crc kubenswrapper[4690]: I0320 17:57:54.274694 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.146656 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567158-tf2mq"] Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.149667 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-tf2mq" Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.153765 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.155179 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.155518 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.193575 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-tf2mq"] Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.270281 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkmk\" (UniqueName: \"kubernetes.io/projected/6aaec591-2764-4591-9113-632b649d5d7b-kube-api-access-sbkmk\") pod \"auto-csr-approver-29567158-tf2mq\" (UID: \"6aaec591-2764-4591-9113-632b649d5d7b\") " pod="openshift-infra/auto-csr-approver-29567158-tf2mq" Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.371741 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkmk\" (UniqueName: \"kubernetes.io/projected/6aaec591-2764-4591-9113-632b649d5d7b-kube-api-access-sbkmk\") pod \"auto-csr-approver-29567158-tf2mq\" (UID: \"6aaec591-2764-4591-9113-632b649d5d7b\") " pod="openshift-infra/auto-csr-approver-29567158-tf2mq" Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.391979 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkmk\" (UniqueName: \"kubernetes.io/projected/6aaec591-2764-4591-9113-632b649d5d7b-kube-api-access-sbkmk\") pod \"auto-csr-approver-29567158-tf2mq\" (UID: \"6aaec591-2764-4591-9113-632b649d5d7b\") " pod="openshift-infra/auto-csr-approver-29567158-tf2mq" Mar 20 17:58:00 crc kubenswrapper[4690]: I0320 17:58:00.498867 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-tf2mq" Mar 20 17:58:01 crc kubenswrapper[4690]: I0320 17:58:01.018691 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-tf2mq"] Mar 20 17:58:01 crc kubenswrapper[4690]: I0320 17:58:01.433204 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-tf2mq" event={"ID":"6aaec591-2764-4591-9113-632b649d5d7b","Type":"ContainerStarted","Data":"ec75609a7eb1a552c87122b127259c528c9914db906f680a9f3896f323133329"} Mar 20 17:58:03 crc kubenswrapper[4690]: I0320 17:58:03.457056 4690 generic.go:334] "Generic (PLEG): container finished" podID="6aaec591-2764-4591-9113-632b649d5d7b" containerID="f5979325d976675a62f975210086321a1495bad2c2f3af8a0b22cdbd2b5a2e40" exitCode=0 Mar 20 17:58:03 crc kubenswrapper[4690]: I0320 17:58:03.457135 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-tf2mq" event={"ID":"6aaec591-2764-4591-9113-632b649d5d7b","Type":"ContainerDied","Data":"f5979325d976675a62f975210086321a1495bad2c2f3af8a0b22cdbd2b5a2e40"} Mar 20 17:58:04 crc kubenswrapper[4690]: I0320 17:58:04.853203 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-tf2mq" Mar 20 17:58:04 crc kubenswrapper[4690]: I0320 17:58:04.968950 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbkmk\" (UniqueName: \"kubernetes.io/projected/6aaec591-2764-4591-9113-632b649d5d7b-kube-api-access-sbkmk\") pod \"6aaec591-2764-4591-9113-632b649d5d7b\" (UID: \"6aaec591-2764-4591-9113-632b649d5d7b\") " Mar 20 17:58:04 crc kubenswrapper[4690]: I0320 17:58:04.978420 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aaec591-2764-4591-9113-632b649d5d7b-kube-api-access-sbkmk" (OuterVolumeSpecName: "kube-api-access-sbkmk") pod "6aaec591-2764-4591-9113-632b649d5d7b" (UID: "6aaec591-2764-4591-9113-632b649d5d7b"). InnerVolumeSpecName "kube-api-access-sbkmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:58:05 crc kubenswrapper[4690]: I0320 17:58:05.072878 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbkmk\" (UniqueName: \"kubernetes.io/projected/6aaec591-2764-4591-9113-632b649d5d7b-kube-api-access-sbkmk\") on node \"crc\" DevicePath \"\"" Mar 20 17:58:05 crc kubenswrapper[4690]: I0320 17:58:05.490730 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-tf2mq" event={"ID":"6aaec591-2764-4591-9113-632b649d5d7b","Type":"ContainerDied","Data":"ec75609a7eb1a552c87122b127259c528c9914db906f680a9f3896f323133329"} Mar 20 17:58:05 crc kubenswrapper[4690]: I0320 17:58:05.490777 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec75609a7eb1a552c87122b127259c528c9914db906f680a9f3896f323133329" Mar 20 17:58:05 crc kubenswrapper[4690]: I0320 17:58:05.490839 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-tf2mq" Mar 20 17:58:05 crc kubenswrapper[4690]: I0320 17:58:05.928554 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-5hrb6"] Mar 20 17:58:05 crc kubenswrapper[4690]: I0320 17:58:05.937196 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-5hrb6"] Mar 20 17:58:07 crc kubenswrapper[4690]: I0320 17:58:07.900787 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3572a99-ffc5-435a-b485-aa6aa5c9479c" path="/var/lib/kubelet/pods/f3572a99-ffc5-435a-b485-aa6aa5c9479c/volumes" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.109096 4690 scope.go:117] "RemoveContainer" containerID="04a913f5a73faf7628592bc268a024dd8bfde9048ee5435639ece269f06fd896" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.135132 4690 scope.go:117] "RemoveContainer" containerID="6bf6bdcd7318d5f656757b8d47a56206bcc57b804d2ea81a3466bcaaef721b20" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.157745 4690 scope.go:117] "RemoveContainer" containerID="e32fdf8c2102bd7ba3cad331c73e818ce6d0901e8c7c47f023f4120143d2d905" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.198805 4690 scope.go:117] "RemoveContainer" containerID="54a42d890ed25cd146ab7ae3de8ae0395d9660ddeb8c74ed0a2cbe365d3c3ffa" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.240907 4690 scope.go:117] "RemoveContainer" containerID="de0cff331526c76453f292c84b51632bb31a727a6cd1b72634f96ec3d1acd386" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.279302 4690 scope.go:117] "RemoveContainer" containerID="70b505a4f5072a8fe51177145adda014c686869af3a089c48a816ae8c2c6d31d" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.315621 4690 scope.go:117] "RemoveContainer" containerID="c301d4b9a666f9ed31a11c29a6644edc9964b4202ffff0da85d905570418c3c8" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.365645 4690 scope.go:117] "RemoveContainer" containerID="c48916f05045307b3413cb61f6721d52155a11bc486656d060bc10c70beba6d8" Mar 20 17:58:20 crc kubenswrapper[4690]: I0320 17:58:20.421857 4690 scope.go:117] "RemoveContainer" containerID="fc3f0da12c63a78519e5d130c139d0e75dd9bc8e62fd2f2a4ab1adb98f49cc1b" Mar 20 17:58:24 crc kubenswrapper[4690]: I0320 17:58:24.274521 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:58:24 crc kubenswrapper[4690]: I0320 17:58:24.275294 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.849988 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jldpp"] Mar 20 17:58:42 crc kubenswrapper[4690]: E0320 17:58:42.852477 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aaec591-2764-4591-9113-632b649d5d7b" containerName="oc" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.852608 4690 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6aaec591-2764-4591-9113-632b649d5d7b" containerName="oc" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.852909 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aaec591-2764-4591-9113-632b649d5d7b" containerName="oc" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.855104 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.871374 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jldpp"] Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.893818 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsm42\" (UniqueName: \"kubernetes.io/projected/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-kube-api-access-bsm42\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.893875 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-utilities\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.894013 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-catalog-content\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.995924 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsm42\" (UniqueName: \"kubernetes.io/projected/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-kube-api-access-bsm42\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.995983 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-utilities\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.996160 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-catalog-content\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.996720 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-utilities\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:42 crc kubenswrapper[4690]: I0320 17:58:42.996748 4690 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-catalog-content\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:43 crc kubenswrapper[4690]: I0320 17:58:43.021155 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsm42\" (UniqueName: \"kubernetes.io/projected/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-kube-api-access-bsm42\") pod \"community-operators-jldpp\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:43 crc kubenswrapper[4690]: I0320 17:58:43.183128 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:43 crc kubenswrapper[4690]: I0320 17:58:43.727458 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jldpp"] Mar 20 17:58:43 crc kubenswrapper[4690]: I0320 17:58:43.954874 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jldpp" event={"ID":"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d","Type":"ContainerStarted","Data":"24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747"} Mar 20 17:58:43 crc kubenswrapper[4690]: I0320 17:58:43.956109 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jldpp" event={"ID":"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d","Type":"ContainerStarted","Data":"b2db56258de5db8daa8ef5afdc5578b3f21ad63aeb3f4d9b2fc7099465b88cac"} Mar 20 17:58:44 crc kubenswrapper[4690]: I0320 17:58:44.967623 4690 generic.go:334] "Generic (PLEG): container finished" podID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerID="24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747" exitCode=0 Mar 20 17:58:44 crc kubenswrapper[4690]: I0320 17:58:44.967760 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jldpp" event={"ID":"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d","Type":"ContainerDied","Data":"24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747"} Mar 20 17:58:46 crc kubenswrapper[4690]: I0320 17:58:46.994975 4690 generic.go:334] "Generic (PLEG): container finished" podID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerID="3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f" exitCode=0 Mar 20 17:58:46 crc kubenswrapper[4690]: I0320 17:58:46.995041 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jldpp" event={"ID":"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d","Type":"ContainerDied","Data":"3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f"} Mar 20 17:58:48 crc kubenswrapper[4690]: I0320 17:58:48.006157 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jldpp" event={"ID":"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d","Type":"ContainerStarted","Data":"e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8"} Mar 20 17:58:48 crc kubenswrapper[4690]: I0320 17:58:48.046986 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jldpp" podStartSLOduration=3.546553963 podStartE2EDuration="6.046963613s" podCreationTimestamp="2026-03-20 17:58:42 +0000 UTC" firstStartedPulling="2026-03-20 17:58:44.970146396 +0000 UTC 
m=+1599.835972094" lastFinishedPulling="2026-03-20 17:58:47.470556046 +0000 UTC m=+1602.336381744" observedRunningTime="2026-03-20 17:58:48.022536724 +0000 UTC m=+1602.888362412" watchObservedRunningTime="2026-03-20 17:58:48.046963613 +0000 UTC m=+1602.912789301" Mar 20 17:58:53 crc kubenswrapper[4690]: I0320 17:58:53.183434 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:53 crc kubenswrapper[4690]: I0320 17:58:53.184221 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:53 crc kubenswrapper[4690]: I0320 17:58:53.240185 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:54 crc kubenswrapper[4690]: I0320 17:58:54.160661 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:54 crc kubenswrapper[4690]: I0320 17:58:54.217622 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jldpp"] Mar 20 17:58:54 crc kubenswrapper[4690]: I0320 17:58:54.274389 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:58:54 crc kubenswrapper[4690]: I0320 17:58:54.274454 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:58:54 crc kubenswrapper[4690]: I0320 17:58:54.274498 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 17:58:54 crc kubenswrapper[4690]: I0320 17:58:54.275198 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:58:54 crc kubenswrapper[4690]: I0320 17:58:54.275251 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" gracePeriod=600 Mar 20 17:58:54 crc kubenswrapper[4690]: E0320 17:58:54.399290 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 17:58:55 crc kubenswrapper[4690]: I0320 17:58:55.100306 4690 
generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" exitCode=0 Mar 20 17:58:55 crc kubenswrapper[4690]: I0320 17:58:55.100401 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930"} Mar 20 17:58:55 crc kubenswrapper[4690]: I0320 17:58:55.100451 4690 scope.go:117] "RemoveContainer" containerID="c6c26ff37905c4c37c818991d48555bc929721ae7acd19a88c41bd55b417a5fe" Mar 20 17:58:55 crc kubenswrapper[4690]: I0320 17:58:55.101784 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 17:58:55 crc kubenswrapper[4690]: E0320 17:58:55.102433 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.109663 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jldpp" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerName="registry-server" containerID="cri-o://e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8" gracePeriod=2 Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.595494 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.781032 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsm42\" (UniqueName: \"kubernetes.io/projected/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-kube-api-access-bsm42\") pod \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.781123 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-catalog-content\") pod \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.781399 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-utilities\") pod \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\" (UID: \"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d\") " Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.782780 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-utilities" (OuterVolumeSpecName: "utilities") pod "88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" (UID: "88337cf0-9721-4dc8-9ca0-5ff9b6828b1d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.791050 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-kube-api-access-bsm42" (OuterVolumeSpecName: "kube-api-access-bsm42") pod "88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" (UID: "88337cf0-9721-4dc8-9ca0-5ff9b6828b1d"). InnerVolumeSpecName "kube-api-access-bsm42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.870384 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" (UID: "88337cf0-9721-4dc8-9ca0-5ff9b6828b1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.884197 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.884235 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsm42\" (UniqueName: \"kubernetes.io/projected/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-kube-api-access-bsm42\") on node \"crc\" DevicePath \"\"" Mar 20 17:58:56 crc kubenswrapper[4690]: I0320 17:58:56.884252 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.122209 4690 generic.go:334] "Generic (PLEG): container finished" podID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerID="e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8" exitCode=0 Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.122301 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jldpp" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.122303 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jldpp" event={"ID":"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d","Type":"ContainerDied","Data":"e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8"} Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.122378 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jldpp" event={"ID":"88337cf0-9721-4dc8-9ca0-5ff9b6828b1d","Type":"ContainerDied","Data":"b2db56258de5db8daa8ef5afdc5578b3f21ad63aeb3f4d9b2fc7099465b88cac"} Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.122412 4690 scope.go:117] "RemoveContainer" containerID="e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.144718 4690 scope.go:117] "RemoveContainer" containerID="3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.166711 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jldpp"] Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.184587 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jldpp"] Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.185990 4690 scope.go:117] "RemoveContainer" containerID="24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.234657 4690 scope.go:117] "RemoveContainer" containerID="e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8" Mar 20 17:58:57 crc kubenswrapper[4690]: E0320 17:58:57.235013 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8\": container with ID starting with e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8 not found: ID does not exist" containerID="e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.235057 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8"} err="failed to get container status \"e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8\": rpc error: code = NotFound desc = could not find container \"e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8\": container with ID starting with e6697f9f0eed2e083697e8a863d7739763f042218f304efa219be99577fedcf8 not found: ID does not exist" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.235084 4690 scope.go:117] "RemoveContainer" containerID="3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f" Mar 20 17:58:57 crc kubenswrapper[4690]: E0320 17:58:57.235428 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f\": container with ID starting with 3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f not found: ID does not exist" containerID="3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.235453 4690 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f"} err="failed to get container status \"3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f\": rpc error: code = NotFound desc = could not find container \"3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f\": container with ID starting with 3ea8580aad82b4de4bbd1993ff59fbc286700b2ec94dec8be9e3d6a48d57429f not found: ID does not exist" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.235469 4690 scope.go:117] "RemoveContainer" containerID="24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747" Mar 20 17:58:57 crc kubenswrapper[4690]: E0320 17:58:57.235789 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747\": container with ID starting with 24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747 not found: ID does not exist" containerID="24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.235844 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747"} err="failed to get container status \"24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747\": rpc error: code = NotFound desc = could not find container \"24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747\": container with ID starting with 24853583c0a49a83f5d1d6777e4786ebdbaa0f209722e40249352d04091af747 not found: ID does not exist" Mar 20 17:58:57 crc kubenswrapper[4690]: I0320 17:58:57.892455 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" path="/var/lib/kubelet/pods/88337cf0-9721-4dc8-9ca0-5ff9b6828b1d/volumes" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.205734 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6jkbp"] Mar 20 17:59:01 crc kubenswrapper[4690]: E0320 17:59:01.206636 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerName="extract-utilities" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.206653 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerName="extract-utilities" Mar 20 17:59:01 crc kubenswrapper[4690]: E0320 17:59:01.206673 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerName="extract-content" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.206682 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerName="extract-content" Mar 20 17:59:01 crc kubenswrapper[4690]: E0320 17:59:01.206697 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerName="registry-server" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.206705 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" containerName="registry-server" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.206954 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="88337cf0-9721-4dc8-9ca0-5ff9b6828b1d" 
containerName="registry-server" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.208674 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.227588 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6jkbp"] Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.261633 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-catalog-content\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.261795 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-utilities\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.262107 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfbn\" (UniqueName: \"kubernetes.io/projected/8ab75f46-5a66-4bbd-922d-682bee04ea9f-kube-api-access-9hfbn\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.364402 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hfbn\" (UniqueName: \"kubernetes.io/projected/8ab75f46-5a66-4bbd-922d-682bee04ea9f-kube-api-access-9hfbn\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.364524 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-catalog-content\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.364936 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-utilities\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.365412 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-catalog-content\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.365439 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-utilities\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " 
pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.388330 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfbn\" (UniqueName: \"kubernetes.io/projected/8ab75f46-5a66-4bbd-922d-682bee04ea9f-kube-api-access-9hfbn\") pod \"certified-operators-6jkbp\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:01 crc kubenswrapper[4690]: I0320 17:59:01.546716 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:02 crc kubenswrapper[4690]: I0320 17:59:02.026317 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6jkbp"] Mar 20 17:59:02 crc kubenswrapper[4690]: I0320 17:59:02.167229 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jkbp" event={"ID":"8ab75f46-5a66-4bbd-922d-682bee04ea9f","Type":"ContainerStarted","Data":"986170e7d8be386abd43dddd8e051fc9ae7e11a1feee35e6229ae2c34fe4a2e6"} Mar 20 17:59:03 crc kubenswrapper[4690]: I0320 17:59:03.179433 4690 generic.go:334] "Generic (PLEG): container finished" podID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerID="4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba" exitCode=0 Mar 20 17:59:03 crc kubenswrapper[4690]: I0320 17:59:03.179542 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jkbp" event={"ID":"8ab75f46-5a66-4bbd-922d-682bee04ea9f","Type":"ContainerDied","Data":"4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba"} Mar 20 17:59:04 crc kubenswrapper[4690]: I0320 17:59:04.195884 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jkbp" event={"ID":"8ab75f46-5a66-4bbd-922d-682bee04ea9f","Type":"ContainerStarted","Data":"f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5"} Mar 20 17:59:06 crc kubenswrapper[4690]: I0320 17:59:06.217925 4690 generic.go:334] "Generic (PLEG): container finished" podID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerID="f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5" exitCode=0 Mar 20 17:59:06 crc kubenswrapper[4690]: I0320 17:59:06.218034 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jkbp" event={"ID":"8ab75f46-5a66-4bbd-922d-682bee04ea9f","Type":"ContainerDied","Data":"f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5"} Mar 20 17:59:07 crc kubenswrapper[4690]: I0320 17:59:07.230413 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jkbp" event={"ID":"8ab75f46-5a66-4bbd-922d-682bee04ea9f","Type":"ContainerStarted","Data":"1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79"} Mar 20 17:59:07 crc kubenswrapper[4690]: I0320 17:59:07.260839 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6jkbp" podStartSLOduration=2.871089442 podStartE2EDuration="6.260821473s" podCreationTimestamp="2026-03-20 17:59:01 +0000 UTC" firstStartedPulling="2026-03-20 17:59:03.18184589 +0000 UTC m=+1618.047671568" lastFinishedPulling="2026-03-20 17:59:06.571577881 +0000 UTC m=+1621.437403599" observedRunningTime="2026-03-20 17:59:07.253240533 +0000 UTC m=+1622.119066221" watchObservedRunningTime="2026-03-20 17:59:07.260821473 
+0000 UTC m=+1622.126647151" Mar 20 17:59:08 crc kubenswrapper[4690]: I0320 17:59:08.883691 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 17:59:08 crc kubenswrapper[4690]: E0320 17:59:08.884778 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 17:59:11 crc kubenswrapper[4690]: I0320 17:59:11.547556 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:11 crc kubenswrapper[4690]: I0320 17:59:11.547926 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:11 crc kubenswrapper[4690]: I0320 17:59:11.618955 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:12 crc kubenswrapper[4690]: I0320 17:59:12.332966 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:12 crc kubenswrapper[4690]: I0320 17:59:12.378588 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6jkbp"] Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.309709 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6jkbp" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerName="registry-server" containerID="cri-o://1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79" gracePeriod=2 Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.808870 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.871715 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hfbn\" (UniqueName: \"kubernetes.io/projected/8ab75f46-5a66-4bbd-922d-682bee04ea9f-kube-api-access-9hfbn\") pod \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.871814 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-catalog-content\") pod \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.871849 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-utilities\") pod \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\" (UID: \"8ab75f46-5a66-4bbd-922d-682bee04ea9f\") " Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.873114 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-utilities" (OuterVolumeSpecName: "utilities") pod "8ab75f46-5a66-4bbd-922d-682bee04ea9f" (UID: "8ab75f46-5a66-4bbd-922d-682bee04ea9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.877551 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab75f46-5a66-4bbd-922d-682bee04ea9f-kube-api-access-9hfbn" (OuterVolumeSpecName: "kube-api-access-9hfbn") pod "8ab75f46-5a66-4bbd-922d-682bee04ea9f" (UID: "8ab75f46-5a66-4bbd-922d-682bee04ea9f"). InnerVolumeSpecName "kube-api-access-9hfbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.923903 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ab75f46-5a66-4bbd-922d-682bee04ea9f" (UID: "8ab75f46-5a66-4bbd-922d-682bee04ea9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.974029 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hfbn\" (UniqueName: \"kubernetes.io/projected/8ab75f46-5a66-4bbd-922d-682bee04ea9f-kube-api-access-9hfbn\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.974058 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:14 crc kubenswrapper[4690]: I0320 17:59:14.974067 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ab75f46-5a66-4bbd-922d-682bee04ea9f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.321905 4690 generic.go:334] "Generic (PLEG): container finished" podID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerID="1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79" exitCode=0 Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.322001 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jkbp" event={"ID":"8ab75f46-5a66-4bbd-922d-682bee04ea9f","Type":"ContainerDied","Data":"1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79"} Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.323387 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jkbp" event={"ID":"8ab75f46-5a66-4bbd-922d-682bee04ea9f","Type":"ContainerDied","Data":"986170e7d8be386abd43dddd8e051fc9ae7e11a1feee35e6229ae2c34fe4a2e6"} Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.323457 4690 scope.go:117] "RemoveContainer" containerID="1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.322056 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jkbp" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.343613 4690 scope.go:117] "RemoveContainer" containerID="f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.367904 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6jkbp"] Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.377978 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6jkbp"] Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.386212 4690 scope.go:117] "RemoveContainer" containerID="4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.430898 4690 scope.go:117] "RemoveContainer" containerID="1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79" Mar 20 17:59:15 crc kubenswrapper[4690]: E0320 17:59:15.431511 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79\": container with ID starting with 1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79 not found: ID does not exist" containerID="1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.431552 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79"} err="failed to get container status \"1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79\": rpc error: code = NotFound desc = could not find container \"1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79\": container with ID starting with 1823a29e52d75658954c21cb8660b355dc7f745e1ea2f0e510f23cd8b318be79 not found: ID does not exist" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.431576 4690 scope.go:117] "RemoveContainer" containerID="f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5" Mar 20 17:59:15 crc kubenswrapper[4690]: E0320 17:59:15.431845 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5\": container with ID starting with f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5 not found: ID does not exist" containerID="f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.431890 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5"} err="failed to get container status \"f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5\": rpc error: code = NotFound desc = could not find container \"f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5\": container with ID starting with f026c1d1ff4a126f37c5d33fe6150c2dbe4fe43de081b30cf1616c3eec1574f5 not found: ID does not exist" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.431923 4690 scope.go:117] "RemoveContainer" containerID="4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba" Mar 20 17:59:15 crc kubenswrapper[4690]: E0320 17:59:15.432322 4690 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba\": container with ID starting with 4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba not found: ID does not exist" containerID="4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.432361 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba"} err="failed to get container status \"4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba\": rpc error: code = NotFound desc = could not find container \"4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba\": container with ID starting with 4d9cf0ffe81e233ed754da6d16be6704cd8118873a120a2b9ac85ba6c6537aba not found: ID does not exist" Mar 20 17:59:15 crc kubenswrapper[4690]: I0320 17:59:15.892893 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" path="/var/lib/kubelet/pods/8ab75f46-5a66-4bbd-922d-682bee04ea9f/volumes" Mar 20 17:59:20 crc kubenswrapper[4690]: I0320 17:59:20.565460 4690 scope.go:117] "RemoveContainer" containerID="18b20fa2f931ba634ef35191294085cb0f1172e33f8bc6cc2c9c6269344e60d8" Mar 20 17:59:20 crc kubenswrapper[4690]: I0320 17:59:20.601343 4690 scope.go:117] "RemoveContainer" containerID="014e5dcc51458e00ea1c1e92fc8066e86e8ba38713cdd0bb150493ff18fbc998" Mar 20 17:59:21 crc kubenswrapper[4690]: I0320 17:59:21.883201 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 17:59:21 crc kubenswrapper[4690]: E0320 17:59:21.883819 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 17:59:35 crc kubenswrapper[4690]: I0320 17:59:35.908896 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 17:59:35 crc kubenswrapper[4690]: E0320 17:59:35.909791 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 17:59:47 crc kubenswrapper[4690]: I0320 17:59:47.883410 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 17:59:47 crc kubenswrapper[4690]: E0320 17:59:47.884238 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" 
podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.143639 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567160-25b7n"] Mar 20 18:00:00 crc kubenswrapper[4690]: E0320 18:00:00.144457 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerName="extract-content" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.144470 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerName="extract-content" Mar 20 18:00:00 crc kubenswrapper[4690]: E0320 18:00:00.144489 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerName="registry-server" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.144495 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerName="registry-server" Mar 20 18:00:00 crc kubenswrapper[4690]: E0320 18:00:00.144502 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerName="extract-utilities" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.144508 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerName="extract-utilities" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.144695 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab75f46-5a66-4bbd-922d-682bee04ea9f" containerName="registry-server" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.145227 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-25b7n" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.147591 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.148020 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.152514 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.155315 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-25b7n"] Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.247066 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb"] Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.248548 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.251026 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.251061 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.258892 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb"] Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.266860 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9f95\" (UniqueName: \"kubernetes.io/projected/a1a94716-92ff-429b-b528-1144c64571c4-kube-api-access-b9f95\") pod \"auto-csr-approver-29567160-25b7n\" (UID: \"a1a94716-92ff-429b-b528-1144c64571c4\") " pod="openshift-infra/auto-csr-approver-29567160-25b7n" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.368985 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz69v\" (UniqueName: \"kubernetes.io/projected/06575ad3-48a5-4fdc-8e69-761bf8ab240b-kube-api-access-kz69v\") pod \"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.369060 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9f95\" (UniqueName: \"kubernetes.io/projected/a1a94716-92ff-429b-b528-1144c64571c4-kube-api-access-b9f95\") pod \"auto-csr-approver-29567160-25b7n\" (UID: \"a1a94716-92ff-429b-b528-1144c64571c4\") " pod="openshift-infra/auto-csr-approver-29567160-25b7n" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.369194 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06575ad3-48a5-4fdc-8e69-761bf8ab240b-secret-volume\") pod \"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.369473 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06575ad3-48a5-4fdc-8e69-761bf8ab240b-config-volume\") pod \"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.392591 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9f95\" (UniqueName: \"kubernetes.io/projected/a1a94716-92ff-429b-b528-1144c64571c4-kube-api-access-b9f95\") pod \"auto-csr-approver-29567160-25b7n\" (UID: \"a1a94716-92ff-429b-b528-1144c64571c4\") " pod="openshift-infra/auto-csr-approver-29567160-25b7n" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.471608 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06575ad3-48a5-4fdc-8e69-761bf8ab240b-secret-volume\") pod 
\"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.472005 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06575ad3-48a5-4fdc-8e69-761bf8ab240b-config-volume\") pod \"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.472212 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz69v\" (UniqueName: \"kubernetes.io/projected/06575ad3-48a5-4fdc-8e69-761bf8ab240b-kube-api-access-kz69v\") pod \"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.473082 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06575ad3-48a5-4fdc-8e69-761bf8ab240b-config-volume\") pod \"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.475228 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06575ad3-48a5-4fdc-8e69-761bf8ab240b-secret-volume\") pod \"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.483207 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-25b7n" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.491315 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz69v\" (UniqueName: \"kubernetes.io/projected/06575ad3-48a5-4fdc-8e69-761bf8ab240b-kube-api-access-kz69v\") pod \"collect-profiles-29567160-sjbbb\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.575168 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:00 crc kubenswrapper[4690]: I0320 18:00:00.998559 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-25b7n"] Mar 20 18:00:01 crc kubenswrapper[4690]: I0320 18:00:01.060909 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb"] Mar 20 18:00:01 crc kubenswrapper[4690]: I0320 18:00:01.994733 4690 generic.go:334] "Generic (PLEG): container finished" podID="06575ad3-48a5-4fdc-8e69-761bf8ab240b" containerID="5033a1b7e5a1ae707b4a8d8bc47a1721de5645ededab65cca0f9f63de56c7796" exitCode=0 Mar 20 18:00:01 crc kubenswrapper[4690]: I0320 18:00:01.994944 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" event={"ID":"06575ad3-48a5-4fdc-8e69-761bf8ab240b","Type":"ContainerDied","Data":"5033a1b7e5a1ae707b4a8d8bc47a1721de5645ededab65cca0f9f63de56c7796"} Mar 20 18:00:01 crc kubenswrapper[4690]: I0320 18:00:01.994992 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" event={"ID":"06575ad3-48a5-4fdc-8e69-761bf8ab240b","Type":"ContainerStarted","Data":"4521b5949332106ce723eb8dc6db0ad88c2f680bdac8e9ee46e066cf7f741670"} Mar 20 18:00:01 crc kubenswrapper[4690]: I0320 18:00:01.996855 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-25b7n" event={"ID":"a1a94716-92ff-429b-b528-1144c64571c4","Type":"ContainerStarted","Data":"7cc87d6880537d3563589f8d167b651c4499e55f4e088c003e8df80f1d9a694c"} Mar 20 18:00:02 crc kubenswrapper[4690]: I0320 18:00:02.883584 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:00:02 crc kubenswrapper[4690]: E0320 18:00:02.884443 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.428184 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.533236 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06575ad3-48a5-4fdc-8e69-761bf8ab240b-config-volume\") pod \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.533388 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06575ad3-48a5-4fdc-8e69-761bf8ab240b-secret-volume\") pod \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.533554 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz69v\" (UniqueName: \"kubernetes.io/projected/06575ad3-48a5-4fdc-8e69-761bf8ab240b-kube-api-access-kz69v\") pod \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\" (UID: \"06575ad3-48a5-4fdc-8e69-761bf8ab240b\") " Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.535252 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06575ad3-48a5-4fdc-8e69-761bf8ab240b-config-volume" (OuterVolumeSpecName: "config-volume") pod "06575ad3-48a5-4fdc-8e69-761bf8ab240b" (UID: "06575ad3-48a5-4fdc-8e69-761bf8ab240b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.541898 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06575ad3-48a5-4fdc-8e69-761bf8ab240b-kube-api-access-kz69v" (OuterVolumeSpecName: "kube-api-access-kz69v") pod "06575ad3-48a5-4fdc-8e69-761bf8ab240b" (UID: "06575ad3-48a5-4fdc-8e69-761bf8ab240b"). InnerVolumeSpecName "kube-api-access-kz69v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.542218 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06575ad3-48a5-4fdc-8e69-761bf8ab240b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06575ad3-48a5-4fdc-8e69-761bf8ab240b" (UID: "06575ad3-48a5-4fdc-8e69-761bf8ab240b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.636448 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz69v\" (UniqueName: \"kubernetes.io/projected/06575ad3-48a5-4fdc-8e69-761bf8ab240b-kube-api-access-kz69v\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.636493 4690 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06575ad3-48a5-4fdc-8e69-761bf8ab240b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:03 crc kubenswrapper[4690]: I0320 18:00:03.636511 4690 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06575ad3-48a5-4fdc-8e69-761bf8ab240b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:04 crc kubenswrapper[4690]: I0320 18:00:04.015864 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" event={"ID":"06575ad3-48a5-4fdc-8e69-761bf8ab240b","Type":"ContainerDied","Data":"4521b5949332106ce723eb8dc6db0ad88c2f680bdac8e9ee46e066cf7f741670"} Mar 20 18:00:04 crc kubenswrapper[4690]: I0320 18:00:04.015906 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4521b5949332106ce723eb8dc6db0ad88c2f680bdac8e9ee46e066cf7f741670" Mar 20 18:00:04 crc kubenswrapper[4690]: I0320 18:00:04.015913 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb" Mar 20 18:00:04 crc kubenswrapper[4690]: I0320 18:00:04.912555 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wnfn4"] Mar 20 18:00:04 crc kubenswrapper[4690]: E0320 18:00:04.913220 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06575ad3-48a5-4fdc-8e69-761bf8ab240b" containerName="collect-profiles" Mar 20 18:00:04 crc kubenswrapper[4690]: I0320 18:00:04.913233 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="06575ad3-48a5-4fdc-8e69-761bf8ab240b" containerName="collect-profiles" Mar 20 18:00:04 crc kubenswrapper[4690]: I0320 18:00:04.913435 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="06575ad3-48a5-4fdc-8e69-761bf8ab240b" containerName="collect-profiles" Mar 20 18:00:04 crc kubenswrapper[4690]: I0320 18:00:04.914992 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:04 crc kubenswrapper[4690]: I0320 18:00:04.935337 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnfn4"] Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.025022 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-25b7n" event={"ID":"a1a94716-92ff-429b-b528-1144c64571c4","Type":"ContainerStarted","Data":"899a33ad242cb750149df083fff3c13bf65d20eae215086d4e8c6bfa16c62150"} Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.065830 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-catalog-content\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.065950 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-utilities\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.066151 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvn6q\" (UniqueName: \"kubernetes.io/projected/f89fcb23-e73f-4539-b5d8-8b20e3762a67-kube-api-access-lvn6q\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.168121 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvn6q\" (UniqueName: \"kubernetes.io/projected/f89fcb23-e73f-4539-b5d8-8b20e3762a67-kube-api-access-lvn6q\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.168193 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-catalog-content\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.168238 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-utilities\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.168748 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-utilities\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.169818 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-catalog-content\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.195207 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvn6q\" (UniqueName: \"kubernetes.io/projected/f89fcb23-e73f-4539-b5d8-8b20e3762a67-kube-api-access-lvn6q\") pod \"redhat-marketplace-wnfn4\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.269173 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.720763 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567160-25b7n" podStartSLOduration=2.158243243 podStartE2EDuration="5.720747645s" podCreationTimestamp="2026-03-20 18:00:00 +0000 UTC" firstStartedPulling="2026-03-20 18:00:00.992651924 +0000 UTC m=+1675.858477602" lastFinishedPulling="2026-03-20 18:00:04.555156326 +0000 UTC m=+1679.420982004" observedRunningTime="2026-03-20 18:00:05.045119851 +0000 UTC m=+1679.910945529" watchObservedRunningTime="2026-03-20 18:00:05.720747645 +0000 UTC m=+1680.586573313" Mar 20 18:00:05 crc kubenswrapper[4690]: I0320 18:00:05.727488 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnfn4"] Mar 20 18:00:05 crc kubenswrapper[4690]: W0320 18:00:05.728481 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf89fcb23_e73f_4539_b5d8_8b20e3762a67.slice/crio-a23190391891190031fb1056813533ee7bc096200acc0120705789bffb113c90 WatchSource:0}: Error finding container a23190391891190031fb1056813533ee7bc096200acc0120705789bffb113c90: Status 404 returned error can't find the container with id a23190391891190031fb1056813533ee7bc096200acc0120705789bffb113c90 Mar 20 18:00:06 crc kubenswrapper[4690]: I0320 18:00:06.043951 4690 generic.go:334] "Generic (PLEG): container finished" podID="a1a94716-92ff-429b-b528-1144c64571c4" containerID="899a33ad242cb750149df083fff3c13bf65d20eae215086d4e8c6bfa16c62150" exitCode=0 Mar 20 18:00:06 crc kubenswrapper[4690]: I0320 18:00:06.044061 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-25b7n" event={"ID":"a1a94716-92ff-429b-b528-1144c64571c4","Type":"ContainerDied","Data":"899a33ad242cb750149df083fff3c13bf65d20eae215086d4e8c6bfa16c62150"} Mar 20 18:00:06 crc kubenswrapper[4690]: I0320 18:00:06.049402 4690 generic.go:334] "Generic (PLEG): container finished" podID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerID="8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487" exitCode=0 Mar 20 18:00:06 crc kubenswrapper[4690]: I0320 18:00:06.049468 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnfn4" event={"ID":"f89fcb23-e73f-4539-b5d8-8b20e3762a67","Type":"ContainerDied","Data":"8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487"} Mar 20 18:00:06 crc kubenswrapper[4690]: I0320 18:00:06.049507 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnfn4" 
event={"ID":"f89fcb23-e73f-4539-b5d8-8b20e3762a67","Type":"ContainerStarted","Data":"a23190391891190031fb1056813533ee7bc096200acc0120705789bffb113c90"} Mar 20 18:00:07 crc kubenswrapper[4690]: I0320 18:00:07.100352 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnfn4" event={"ID":"f89fcb23-e73f-4539-b5d8-8b20e3762a67","Type":"ContainerStarted","Data":"5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106"} Mar 20 18:00:07 crc kubenswrapper[4690]: I0320 18:00:07.436948 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-25b7n" Mar 20 18:00:07 crc kubenswrapper[4690]: I0320 18:00:07.511028 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9f95\" (UniqueName: \"kubernetes.io/projected/a1a94716-92ff-429b-b528-1144c64571c4-kube-api-access-b9f95\") pod \"a1a94716-92ff-429b-b528-1144c64571c4\" (UID: \"a1a94716-92ff-429b-b528-1144c64571c4\") " Mar 20 18:00:07 crc kubenswrapper[4690]: I0320 18:00:07.517939 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a94716-92ff-429b-b528-1144c64571c4-kube-api-access-b9f95" (OuterVolumeSpecName: "kube-api-access-b9f95") pod "a1a94716-92ff-429b-b528-1144c64571c4" (UID: "a1a94716-92ff-429b-b528-1144c64571c4"). InnerVolumeSpecName "kube-api-access-b9f95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:07 crc kubenswrapper[4690]: I0320 18:00:07.613518 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9f95\" (UniqueName: \"kubernetes.io/projected/a1a94716-92ff-429b-b528-1144c64571c4-kube-api-access-b9f95\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:08 crc kubenswrapper[4690]: I0320 18:00:08.111514 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-25b7n" event={"ID":"a1a94716-92ff-429b-b528-1144c64571c4","Type":"ContainerDied","Data":"7cc87d6880537d3563589f8d167b651c4499e55f4e088c003e8df80f1d9a694c"} Mar 20 18:00:08 crc kubenswrapper[4690]: I0320 18:00:08.111804 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cc87d6880537d3563589f8d167b651c4499e55f4e088c003e8df80f1d9a694c" Mar 20 18:00:08 crc kubenswrapper[4690]: I0320 18:00:08.111862 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-25b7n" Mar 20 18:00:08 crc kubenswrapper[4690]: I0320 18:00:08.117411 4690 generic.go:334] "Generic (PLEG): container finished" podID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerID="5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106" exitCode=0 Mar 20 18:00:08 crc kubenswrapper[4690]: I0320 18:00:08.117458 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnfn4" event={"ID":"f89fcb23-e73f-4539-b5d8-8b20e3762a67","Type":"ContainerDied","Data":"5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106"} Mar 20 18:00:08 crc kubenswrapper[4690]: I0320 18:00:08.153504 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-x5hrx"] Mar 20 18:00:08 crc kubenswrapper[4690]: I0320 18:00:08.161604 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-x5hrx"] Mar 20 18:00:09 crc kubenswrapper[4690]: I0320 18:00:09.132775 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnfn4" event={"ID":"f89fcb23-e73f-4539-b5d8-8b20e3762a67","Type":"ContainerStarted","Data":"33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649"} Mar 20 18:00:09 crc kubenswrapper[4690]: I0320 18:00:09.161367 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wnfn4" podStartSLOduration=2.661057949 podStartE2EDuration="5.161346046s" podCreationTimestamp="2026-03-20 18:00:04 +0000 UTC" firstStartedPulling="2026-03-20 18:00:06.052486793 +0000 UTC m=+1680.918312471" lastFinishedPulling="2026-03-20 18:00:08.55277489 +0000 UTC m=+1683.418600568" observedRunningTime="2026-03-20 18:00:09.156779488 +0000 UTC m=+1684.022605176" watchObservedRunningTime="2026-03-20 18:00:09.161346046 +0000 UTC m=+1684.027171734" Mar 20 18:00:09 crc kubenswrapper[4690]: I0320 18:00:09.893339 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b35a2b62-f869-4a99-a922-68822abfaa30" path="/var/lib/kubelet/pods/b35a2b62-f869-4a99-a922-68822abfaa30/volumes" Mar 20 18:00:13 crc kubenswrapper[4690]: I0320 18:00:13.884372 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:00:13 crc kubenswrapper[4690]: E0320 18:00:13.885578 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:00:15 crc kubenswrapper[4690]: I0320 18:00:15.270509 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:15 crc kubenswrapper[4690]: I0320 18:00:15.270923 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:15 crc kubenswrapper[4690]: I0320 18:00:15.328729 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:16 crc kubenswrapper[4690]: I0320 18:00:16.272844 4690 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:16 crc kubenswrapper[4690]: I0320 18:00:16.339507 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnfn4"] Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.212471 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wnfn4" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerName="registry-server" containerID="cri-o://33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649" gracePeriod=2 Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.699760 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.754567 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-catalog-content\") pod \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.754771 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvn6q\" (UniqueName: \"kubernetes.io/projected/f89fcb23-e73f-4539-b5d8-8b20e3762a67-kube-api-access-lvn6q\") pod \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.754897 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-utilities\") pod \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\" (UID: \"f89fcb23-e73f-4539-b5d8-8b20e3762a67\") " Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.756172 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-utilities" (OuterVolumeSpecName: "utilities") pod "f89fcb23-e73f-4539-b5d8-8b20e3762a67" (UID: "f89fcb23-e73f-4539-b5d8-8b20e3762a67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.760554 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89fcb23-e73f-4539-b5d8-8b20e3762a67-kube-api-access-lvn6q" (OuterVolumeSpecName: "kube-api-access-lvn6q") pod "f89fcb23-e73f-4539-b5d8-8b20e3762a67" (UID: "f89fcb23-e73f-4539-b5d8-8b20e3762a67"). InnerVolumeSpecName "kube-api-access-lvn6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.782708 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f89fcb23-e73f-4539-b5d8-8b20e3762a67" (UID: "f89fcb23-e73f-4539-b5d8-8b20e3762a67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.856839 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvn6q\" (UniqueName: \"kubernetes.io/projected/f89fcb23-e73f-4539-b5d8-8b20e3762a67-kube-api-access-lvn6q\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.856871 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:18 crc kubenswrapper[4690]: I0320 18:00:18.856883 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89fcb23-e73f-4539-b5d8-8b20e3762a67-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.228189 4690 generic.go:334] "Generic (PLEG): container finished" podID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerID="33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649" exitCode=0 Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.228238 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnfn4" event={"ID":"f89fcb23-e73f-4539-b5d8-8b20e3762a67","Type":"ContainerDied","Data":"33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649"} Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.228310 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnfn4" event={"ID":"f89fcb23-e73f-4539-b5d8-8b20e3762a67","Type":"ContainerDied","Data":"a23190391891190031fb1056813533ee7bc096200acc0120705789bffb113c90"} Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.228315 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnfn4" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.228330 4690 scope.go:117] "RemoveContainer" containerID="33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.261684 4690 scope.go:117] "RemoveContainer" containerID="5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.272121 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnfn4"] Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.283240 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnfn4"] Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.304737 4690 scope.go:117] "RemoveContainer" containerID="8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.338504 4690 scope.go:117] "RemoveContainer" containerID="33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649" Mar 20 18:00:19 crc kubenswrapper[4690]: E0320 18:00:19.339084 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649\": container with ID starting with 33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649 not found: ID does not exist" containerID="33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.339128 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649"} err="failed to get container status \"33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649\": rpc error: code = NotFound desc = could not find container \"33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649\": container with ID starting with 33cda9f9847a7f806f600bb0058b67899ae49383a5a16b98b0633b6db5c96649 not found: ID does not exist" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.339154 4690 scope.go:117] "RemoveContainer" containerID="5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106" Mar 20 18:00:19 crc kubenswrapper[4690]: E0320 18:00:19.339680 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106\": container with ID starting with 5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106 not found: ID does not exist" containerID="5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.339719 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106"} err="failed to get container status \"5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106\": rpc error: code = NotFound desc = could not find container \"5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106\": container with ID starting with 5c4f2af30b19abb03cef320cead723ec30be7809eb14a73ccd0f1f08f3315106 not found: ID does not exist" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.339744 4690 scope.go:117] "RemoveContainer" 
containerID="8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487" Mar 20 18:00:19 crc kubenswrapper[4690]: E0320 18:00:19.340076 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487\": container with ID starting with 8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487 not found: ID does not exist" containerID="8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.340140 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487"} err="failed to get container status \"8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487\": rpc error: code = NotFound desc = could not find container \"8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487\": container with ID starting with 8b86e285b4fa717f50dad18a02cce21012bdc13b57a36469ccdf3e6de0114487 not found: ID does not exist" Mar 20 18:00:19 crc kubenswrapper[4690]: I0320 18:00:19.894455 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" path="/var/lib/kubelet/pods/f89fcb23-e73f-4539-b5d8-8b20e3762a67/volumes" Mar 20 18:00:20 crc kubenswrapper[4690]: I0320 18:00:20.702429 4690 scope.go:117] "RemoveContainer" containerID="78a6a708e8f1e4570a60795cb1ccaacbc4545031dccf0efe5c2caca106220363" Mar 20 18:00:20 crc kubenswrapper[4690]: I0320 18:00:20.724306 4690 scope.go:117] "RemoveContainer" containerID="3a3910af7bc6e45224246856a6c4b3db5027b27a45428a159726fd635325e20f" Mar 20 18:00:20 crc kubenswrapper[4690]: I0320 18:00:20.743356 4690 scope.go:117] "RemoveContainer" containerID="eba29818292a7ee79edced794f1a5f5bd1986090cfbc4d98d81db31754af90eb" Mar 20 18:00:26 crc kubenswrapper[4690]: I0320 18:00:26.883725 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:00:26 crc kubenswrapper[4690]: E0320 18:00:26.884921 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:00:37 crc kubenswrapper[4690]: I0320 18:00:37.883480 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:00:37 crc kubenswrapper[4690]: E0320 18:00:37.884778 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:00:49 crc kubenswrapper[4690]: I0320 18:00:49.883362 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:00:49 crc kubenswrapper[4690]: E0320 18:00:49.885398 4690 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:00:54 crc kubenswrapper[4690]: I0320 18:00:54.309392 4690 generic.go:334] "Generic (PLEG): container finished" podID="33405126-fa78-4ad4-9587-e157ffd9f389" containerID="ab94ef72179a2a482a89ed746f85123c3a4cbc06440b8bbed7fd308ee24c56b8" exitCode=0 Mar 20 18:00:54 crc kubenswrapper[4690]: I0320 18:00:54.309513 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" event={"ID":"33405126-fa78-4ad4-9587-e157ffd9f389","Type":"ContainerDied","Data":"ab94ef72179a2a482a89ed746f85123c3a4cbc06440b8bbed7fd308ee24c56b8"} Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.786474 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.920351 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9nsr\" (UniqueName: \"kubernetes.io/projected/33405126-fa78-4ad4-9587-e157ffd9f389-kube-api-access-c9nsr\") pod \"33405126-fa78-4ad4-9587-e157ffd9f389\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.920745 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-ssh-key-openstack-edpm-ipam\") pod \"33405126-fa78-4ad4-9587-e157ffd9f389\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.920936 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-bootstrap-combined-ca-bundle\") pod \"33405126-fa78-4ad4-9587-e157ffd9f389\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.921161 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-inventory\") pod \"33405126-fa78-4ad4-9587-e157ffd9f389\" (UID: \"33405126-fa78-4ad4-9587-e157ffd9f389\") " Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.939337 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33405126-fa78-4ad4-9587-e157ffd9f389-kube-api-access-c9nsr" (OuterVolumeSpecName: "kube-api-access-c9nsr") pod "33405126-fa78-4ad4-9587-e157ffd9f389" (UID: "33405126-fa78-4ad4-9587-e157ffd9f389"). InnerVolumeSpecName "kube-api-access-c9nsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.940029 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "33405126-fa78-4ad4-9587-e157ffd9f389" (UID: "33405126-fa78-4ad4-9587-e157ffd9f389"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.958041 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "33405126-fa78-4ad4-9587-e157ffd9f389" (UID: "33405126-fa78-4ad4-9587-e157ffd9f389"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:00:55 crc kubenswrapper[4690]: I0320 18:00:55.965192 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-inventory" (OuterVolumeSpecName: "inventory") pod "33405126-fa78-4ad4-9587-e157ffd9f389" (UID: "33405126-fa78-4ad4-9587-e157ffd9f389"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.023769 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9nsr\" (UniqueName: \"kubernetes.io/projected/33405126-fa78-4ad4-9587-e157ffd9f389-kube-api-access-c9nsr\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.023813 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.023825 4690 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.023838 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33405126-fa78-4ad4-9587-e157ffd9f389-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.351816 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" event={"ID":"33405126-fa78-4ad4-9587-e157ffd9f389","Type":"ContainerDied","Data":"84e6318b40412bcde2cfcdc1619c2fb992a79e983ffda7619d62b952c1965e05"} Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.351865 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84e6318b40412bcde2cfcdc1619c2fb992a79e983ffda7619d62b952c1965e05" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.351956 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.476664 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk"] Mar 20 18:00:56 crc kubenswrapper[4690]: E0320 18:00:56.478335 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33405126-fa78-4ad4-9587-e157ffd9f389" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.478369 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="33405126-fa78-4ad4-9587-e157ffd9f389" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 18:00:56 crc kubenswrapper[4690]: E0320 18:00:56.478423 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerName="extract-content" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.478436 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerName="extract-content" Mar 20 18:00:56 crc kubenswrapper[4690]: E0320 18:00:56.478462 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a94716-92ff-429b-b528-1144c64571c4" containerName="oc" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.478471 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a94716-92ff-429b-b528-1144c64571c4" containerName="oc" Mar 20 18:00:56 crc kubenswrapper[4690]: E0320 18:00:56.478485 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerName="registry-server" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.478493 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerName="registry-server" Mar 20 18:00:56 crc kubenswrapper[4690]: E0320 18:00:56.478516 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerName="extract-utilities" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.478525 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerName="extract-utilities" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.479034 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89fcb23-e73f-4539-b5d8-8b20e3762a67" containerName="registry-server" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.479071 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="33405126-fa78-4ad4-9587-e157ffd9f389" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.479103 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a94716-92ff-429b-b528-1144c64571c4" containerName="oc" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.480433 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.485247 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.485521 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.486295 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.487885 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.491229 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk"] Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.634996 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.635090 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs7jx\" (UniqueName: \"kubernetes.io/projected/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-kube-api-access-gs7jx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.635430 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.737449 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.737547 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.737669 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs7jx\" (UniqueName: 
\"kubernetes.io/projected/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-kube-api-access-gs7jx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.747655 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.748921 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.764767 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs7jx\" (UniqueName: \"kubernetes.io/projected/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-kube-api-access-gs7jx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:56 crc kubenswrapper[4690]: I0320 18:00:56.796360 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:00:57 crc kubenswrapper[4690]: I0320 18:00:57.435705 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk"] Mar 20 18:00:58 crc kubenswrapper[4690]: I0320 18:00:58.373482 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" event={"ID":"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4","Type":"ContainerStarted","Data":"f7d0a44f820ecdec63361aec2d469466a3b87821f41e9fb74e8440fe6dfd5999"} Mar 20 18:00:59 crc kubenswrapper[4690]: I0320 18:00:59.393348 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" event={"ID":"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4","Type":"ContainerStarted","Data":"53c786d4c1f25a343e397178757134a6918e52682c8fd88551b6155f0de8e35c"} Mar 20 18:00:59 crc kubenswrapper[4690]: I0320 18:00:59.414837 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" podStartSLOduration=2.078210091 podStartE2EDuration="3.414791724s" podCreationTimestamp="2026-03-20 18:00:56 +0000 UTC" firstStartedPulling="2026-03-20 18:00:57.439960294 +0000 UTC m=+1732.305785962" lastFinishedPulling="2026-03-20 18:00:58.776541877 +0000 UTC m=+1733.642367595" observedRunningTime="2026-03-20 18:00:59.412236702 +0000 UTC m=+1734.278062470" watchObservedRunningTime="2026-03-20 18:00:59.414791724 +0000 UTC m=+1734.280617402" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.139871 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567161-sq958"] Mar 20 18:01:00 crc 
kubenswrapper[4690]: I0320 18:01:00.142595 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.170476 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567161-sq958"] Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.207467 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-config-data\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.207795 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-combined-ca-bundle\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.207827 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj86s\" (UniqueName: \"kubernetes.io/projected/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-kube-api-access-dj86s\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.207883 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-fernet-keys\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.309523 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-fernet-keys\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.309711 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-config-data\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.309774 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-combined-ca-bundle\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.309805 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj86s\" (UniqueName: \"kubernetes.io/projected/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-kube-api-access-dj86s\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc 
kubenswrapper[4690]: I0320 18:01:00.315811 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-config-data\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.316547 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-fernet-keys\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.332500 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-combined-ca-bundle\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.334189 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj86s\" (UniqueName: \"kubernetes.io/projected/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-kube-api-access-dj86s\") pod \"keystone-cron-29567161-sq958\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.465959 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:00 crc kubenswrapper[4690]: I0320 18:01:00.758477 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567161-sq958"] Mar 20 18:01:01 crc kubenswrapper[4690]: I0320 18:01:01.416187 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-sq958" event={"ID":"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66","Type":"ContainerStarted","Data":"b39e8e8ff1471ec165dbe2fd73c0c94717860cc6b6240651bf8f8bce8181e3ae"} Mar 20 18:01:01 crc kubenswrapper[4690]: I0320 18:01:01.416566 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-sq958" event={"ID":"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66","Type":"ContainerStarted","Data":"7c34eb7b7ab4ff8016dba4ef98391f83180e83c20188f47f6dc315c473fe1f8b"} Mar 20 18:01:01 crc kubenswrapper[4690]: I0320 18:01:01.439761 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567161-sq958" podStartSLOduration=1.439736303 podStartE2EDuration="1.439736303s" podCreationTimestamp="2026-03-20 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:01:01.436627965 +0000 UTC m=+1736.302453683" watchObservedRunningTime="2026-03-20 18:01:01.439736303 +0000 UTC m=+1736.305561981" Mar 20 18:01:03 crc kubenswrapper[4690]: I0320 18:01:03.438891 4690 generic.go:334] "Generic (PLEG): container finished" podID="3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66" containerID="b39e8e8ff1471ec165dbe2fd73c0c94717860cc6b6240651bf8f8bce8181e3ae" exitCode=0 Mar 20 18:01:03 crc kubenswrapper[4690]: I0320 18:01:03.439005 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-sq958" 
event={"ID":"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66","Type":"ContainerDied","Data":"b39e8e8ff1471ec165dbe2fd73c0c94717860cc6b6240651bf8f8bce8181e3ae"} Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.800896 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.809363 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-config-data\") pod \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.809556 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-fernet-keys\") pod \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.809714 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-combined-ca-bundle\") pod \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.809830 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj86s\" (UniqueName: \"kubernetes.io/projected/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-kube-api-access-dj86s\") pod \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\" (UID: \"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66\") " Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.815847 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66" (UID: "3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.816621 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-kube-api-access-dj86s" (OuterVolumeSpecName: "kube-api-access-dj86s") pod "3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66" (UID: "3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66"). InnerVolumeSpecName "kube-api-access-dj86s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.849414 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66" (UID: "3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.884395 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:01:04 crc kubenswrapper[4690]: E0320 18:01:04.884837 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.895057 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-config-data" (OuterVolumeSpecName: "config-data") pod "3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66" (UID: "3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.912863 4690 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.912898 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj86s\" (UniqueName: \"kubernetes.io/projected/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-kube-api-access-dj86s\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.912913 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:04 crc kubenswrapper[4690]: I0320 18:01:04.912925 4690 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4690]: I0320 18:01:05.468824 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-sq958" event={"ID":"3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66","Type":"ContainerDied","Data":"7c34eb7b7ab4ff8016dba4ef98391f83180e83c20188f47f6dc315c473fe1f8b"} Mar 20 18:01:05 crc kubenswrapper[4690]: I0320 18:01:05.468863 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c34eb7b7ab4ff8016dba4ef98391f83180e83c20188f47f6dc315c473fe1f8b" Mar 20 18:01:05 crc kubenswrapper[4690]: I0320 18:01:05.468914 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-sq958" Mar 20 18:01:16 crc kubenswrapper[4690]: I0320 18:01:16.883731 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:01:16 crc kubenswrapper[4690]: E0320 18:01:16.885115 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:01:29 crc kubenswrapper[4690]: I0320 18:01:29.883740 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:01:29 crc kubenswrapper[4690]: E0320 18:01:29.884781 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:01:40 crc kubenswrapper[4690]: I0320 18:01:40.883431 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:01:40 crc kubenswrapper[4690]: E0320 18:01:40.885279 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:01:52 crc kubenswrapper[4690]: I0320 18:01:52.884149 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:01:52 crc kubenswrapper[4690]: E0320 18:01:52.885235 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:01:56 crc kubenswrapper[4690]: I0320 18:01:56.043367 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9jdjw"] Mar 20 18:01:56 crc kubenswrapper[4690]: I0320 18:01:56.055591 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d6a5-account-create-update-kwm5l"] Mar 20 18:01:56 crc kubenswrapper[4690]: I0320 18:01:56.067061 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d6a5-account-create-update-kwm5l"] Mar 20 18:01:56 crc kubenswrapper[4690]: I0320 18:01:56.076668 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9jdjw"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.046734 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-create-6nm7z"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.065433 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qch77"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.076555 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-43e3-account-create-update-hb8zx"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.087849 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-98d6-account-create-update-hbppj"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.098085 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qch77"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.105578 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6nm7z"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.113316 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-98d6-account-create-update-hbppj"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.120586 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-43e3-account-create-update-hb8zx"] Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.895528 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b775c4-9217-4241-8049-1253db4ecb81" path="/var/lib/kubelet/pods/08b775c4-9217-4241-8049-1253db4ecb81/volumes" Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.896298 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8762da29-e17c-42a8-b233-a6c565c3a6de" path="/var/lib/kubelet/pods/8762da29-e17c-42a8-b233-a6c565c3a6de/volumes" Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.896939 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="995742f4-b26d-4f30-ae2c-2635257cd664" path="/var/lib/kubelet/pods/995742f4-b26d-4f30-ae2c-2635257cd664/volumes" Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.897631 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6415de2-c5b2-4077-91d9-74f1c0852b56" path="/var/lib/kubelet/pods/d6415de2-c5b2-4077-91d9-74f1c0852b56/volumes" Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.898899 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e760b621-28bc-4ede-b08d-8e46250407eb" path="/var/lib/kubelet/pods/e760b621-28bc-4ede-b08d-8e46250407eb/volumes" Mar 20 18:01:57 crc kubenswrapper[4690]: I0320 18:01:57.899623 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8409e1a-31b0-4050-86bf-69c3d18c6185" path="/var/lib/kubelet/pods/f8409e1a-31b0-4050-86bf-69c3d18c6185/volumes" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.143083 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567162-cs4rc"] Mar 20 18:02:00 crc kubenswrapper[4690]: E0320 18:02:00.144157 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.144175 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.144426 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 
18:02:00.145147 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-cs4rc" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.147584 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.147756 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.148298 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.156409 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-cs4rc"] Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.165796 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gzdj\" (UniqueName: \"kubernetes.io/projected/5c99c907-4cb3-4381-88e6-ada4a9efb021-kube-api-access-8gzdj\") pod \"auto-csr-approver-29567162-cs4rc\" (UID: \"5c99c907-4cb3-4381-88e6-ada4a9efb021\") " pod="openshift-infra/auto-csr-approver-29567162-cs4rc" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.267482 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gzdj\" (UniqueName: \"kubernetes.io/projected/5c99c907-4cb3-4381-88e6-ada4a9efb021-kube-api-access-8gzdj\") pod \"auto-csr-approver-29567162-cs4rc\" (UID: \"5c99c907-4cb3-4381-88e6-ada4a9efb021\") " pod="openshift-infra/auto-csr-approver-29567162-cs4rc" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.284496 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gzdj\" (UniqueName: \"kubernetes.io/projected/5c99c907-4cb3-4381-88e6-ada4a9efb021-kube-api-access-8gzdj\") pod \"auto-csr-approver-29567162-cs4rc\" (UID: \"5c99c907-4cb3-4381-88e6-ada4a9efb021\") " pod="openshift-infra/auto-csr-approver-29567162-cs4rc" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.467229 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-cs4rc" Mar 20 18:02:00 crc kubenswrapper[4690]: I0320 18:02:00.919617 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-cs4rc"] Mar 20 18:02:01 crc kubenswrapper[4690]: I0320 18:02:01.090235 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-cs4rc" event={"ID":"5c99c907-4cb3-4381-88e6-ada4a9efb021","Type":"ContainerStarted","Data":"de9870792f322281ca8b67db86ea71811f4a9091ee402f4d72fd20b94f149032"} Mar 20 18:02:03 crc kubenswrapper[4690]: I0320 18:02:03.116595 4690 generic.go:334] "Generic (PLEG): container finished" podID="5c99c907-4cb3-4381-88e6-ada4a9efb021" containerID="21e06f020d4543c1c68681b411fbfb109693f057944924580580bb928e32dd5a" exitCode=0 Mar 20 18:02:03 crc kubenswrapper[4690]: I0320 18:02:03.116685 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-cs4rc" event={"ID":"5c99c907-4cb3-4381-88e6-ada4a9efb021","Type":"ContainerDied","Data":"21e06f020d4543c1c68681b411fbfb109693f057944924580580bb928e32dd5a"} Mar 20 18:02:04 crc kubenswrapper[4690]: I0320 18:02:04.500341 4690 util.go:48] "No ready sandbox for pod can be found. 
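
The numeric suffixes on these cron-created pods (auto-csr-approver-29567160/-29567162, keystone-cron-29567161) correspond to the scheduled run time expressed in minutes since the Unix epoch, the CronJob controller's usual job-naming scheme; converting them back reproduces the 18:00, 18:01 and 18:02 timestamps seen above. A short stand-alone check:

```go
// Convert the job-name suffixes (minutes since the Unix epoch) back to
// wall-clock times and compare with the log timestamps.
package main

import (
	"fmt"
	"time"
)

func main() {
	for _, minutes := range []int64{29567160, 29567161, 29567162} {
		t := time.Unix(minutes*60, 0).UTC()
		fmt.Println(minutes, t) // 29567162 -> 2026-03-20 18:02:00 +0000 UTC
	}
}
```
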
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-cs4rc" Mar 20 18:02:04 crc kubenswrapper[4690]: I0320 18:02:04.658504 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gzdj\" (UniqueName: \"kubernetes.io/projected/5c99c907-4cb3-4381-88e6-ada4a9efb021-kube-api-access-8gzdj\") pod \"5c99c907-4cb3-4381-88e6-ada4a9efb021\" (UID: \"5c99c907-4cb3-4381-88e6-ada4a9efb021\") " Mar 20 18:02:04 crc kubenswrapper[4690]: I0320 18:02:04.665866 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c99c907-4cb3-4381-88e6-ada4a9efb021-kube-api-access-8gzdj" (OuterVolumeSpecName: "kube-api-access-8gzdj") pod "5c99c907-4cb3-4381-88e6-ada4a9efb021" (UID: "5c99c907-4cb3-4381-88e6-ada4a9efb021"). InnerVolumeSpecName "kube-api-access-8gzdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:02:04 crc kubenswrapper[4690]: I0320 18:02:04.760918 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gzdj\" (UniqueName: \"kubernetes.io/projected/5c99c907-4cb3-4381-88e6-ada4a9efb021-kube-api-access-8gzdj\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:05 crc kubenswrapper[4690]: I0320 18:02:05.140979 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-cs4rc" event={"ID":"5c99c907-4cb3-4381-88e6-ada4a9efb021","Type":"ContainerDied","Data":"de9870792f322281ca8b67db86ea71811f4a9091ee402f4d72fd20b94f149032"} Mar 20 18:02:05 crc kubenswrapper[4690]: I0320 18:02:05.141324 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9870792f322281ca8b67db86ea71811f4a9091ee402f4d72fd20b94f149032" Mar 20 18:02:05 crc kubenswrapper[4690]: I0320 18:02:05.141069 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-cs4rc" Mar 20 18:02:05 crc kubenswrapper[4690]: I0320 18:02:05.587749 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-n6gh2"] Mar 20 18:02:05 crc kubenswrapper[4690]: I0320 18:02:05.598819 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-n6gh2"] Mar 20 18:02:05 crc kubenswrapper[4690]: I0320 18:02:05.906738 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98fc80fa-7ce3-43dc-9ec0-cccc94302c99" path="/var/lib/kubelet/pods/98fc80fa-7ce3-43dc-9ec0-cccc94302c99/volumes" Mar 20 18:02:07 crc kubenswrapper[4690]: I0320 18:02:07.883477 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:02:07 crc kubenswrapper[4690]: E0320 18:02:07.884174 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:02:15 crc kubenswrapper[4690]: I0320 18:02:15.032371 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-khb42"] Mar 20 18:02:15 crc kubenswrapper[4690]: I0320 18:02:15.041694 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-khb42"] Mar 20 18:02:15 crc kubenswrapper[4690]: I0320 18:02:15.901571 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd3dc2e5-e309-432f-a876-8cf78434e9d7" path="/var/lib/kubelet/pods/fd3dc2e5-e309-432f-a876-8cf78434e9d7/volumes" Mar 20 18:02:19 crc kubenswrapper[4690]: I0320 18:02:19.884079 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:02:19 crc kubenswrapper[4690]: E0320 18:02:19.885186 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:02:20 crc kubenswrapper[4690]: I0320 18:02:20.885629 4690 scope.go:117] "RemoveContainer" containerID="c8c1f89c24b34f4002fd1d1894f93e1415f49d52941e54264d9454bf070accb2" Mar 20 18:02:20 crc kubenswrapper[4690]: I0320 18:02:20.953169 4690 scope.go:117] "RemoveContainer" containerID="708db4bb1e48b05476ad52b46d6ff75400022cc63ab48bf41ae24dd02a6cfd03" Mar 20 18:02:20 crc kubenswrapper[4690]: I0320 18:02:20.988577 4690 scope.go:117] "RemoveContainer" containerID="e7517286ef6ba806a0a448a5c4c3fb9ed64d8f3cc9d5aade27b840310610bc2e" Mar 20 18:02:21 crc kubenswrapper[4690]: I0320 18:02:21.040908 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-x2bws"] Mar 20 18:02:21 crc kubenswrapper[4690]: I0320 18:02:21.047139 4690 scope.go:117] "RemoveContainer" containerID="ebb24362457997a0b0694fab60326ed9c7225d8706dabcd9a2cd29616831dcc5" Mar 20 18:02:21 crc kubenswrapper[4690]: I0320 18:02:21.051047 4690 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-x2bws"] Mar 20 18:02:21 crc kubenswrapper[4690]: I0320 18:02:21.087498 4690 scope.go:117] "RemoveContainer" containerID="a5fed98b3c2fb3d0785b82bc8369314110a9df4d935ba8e7ec174b0b47a35990" Mar 20 18:02:21 crc kubenswrapper[4690]: I0320 18:02:21.121812 4690 scope.go:117] "RemoveContainer" containerID="a62e6985e8a728e6df4be921ff90b2ef220525cb6bedddd86024e5a64cceb273" Mar 20 18:02:21 crc kubenswrapper[4690]: I0320 18:02:21.166586 4690 scope.go:117] "RemoveContainer" containerID="1b3f404decb5c61241855588ab03641c91bda827e6d79fe66660b1469f93eaea" Mar 20 18:02:21 crc kubenswrapper[4690]: I0320 18:02:21.186279 4690 scope.go:117] "RemoveContainer" containerID="9e8043a54febb29b6ec235d3caddc995474dc5724fc958d914aa6e1fb3298c94" Mar 20 18:02:21 crc kubenswrapper[4690]: I0320 18:02:21.900077 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08ec433-4043-43b3-ae56-de712919babe" path="/var/lib/kubelet/pods/d08ec433-4043-43b3-ae56-de712919babe/volumes" Mar 20 18:02:33 crc kubenswrapper[4690]: I0320 18:02:33.511093 4690 generic.go:334] "Generic (PLEG): container finished" podID="86d7b6e3-05d5-475d-b95f-9ba0d5b43df4" containerID="53c786d4c1f25a343e397178757134a6918e52682c8fd88551b6155f0de8e35c" exitCode=0 Mar 20 18:02:33 crc kubenswrapper[4690]: I0320 18:02:33.511208 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" event={"ID":"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4","Type":"ContainerDied","Data":"53c786d4c1f25a343e397178757134a6918e52682c8fd88551b6155f0de8e35c"} Mar 20 18:02:33 crc kubenswrapper[4690]: I0320 18:02:33.884175 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:02:33 crc kubenswrapper[4690]: E0320 18:02:33.884746 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:02:34 crc kubenswrapper[4690]: I0320 18:02:34.933641 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.081733 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-inventory\") pod \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.081828 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs7jx\" (UniqueName: \"kubernetes.io/projected/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-kube-api-access-gs7jx\") pod \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.081946 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-ssh-key-openstack-edpm-ipam\") pod \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\" (UID: \"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4\") " Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.086913 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-kube-api-access-gs7jx" (OuterVolumeSpecName: "kube-api-access-gs7jx") pod "86d7b6e3-05d5-475d-b95f-9ba0d5b43df4" (UID: "86d7b6e3-05d5-475d-b95f-9ba0d5b43df4"). InnerVolumeSpecName "kube-api-access-gs7jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.108308 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "86d7b6e3-05d5-475d-b95f-9ba0d5b43df4" (UID: "86d7b6e3-05d5-475d-b95f-9ba0d5b43df4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.113437 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-inventory" (OuterVolumeSpecName: "inventory") pod "86d7b6e3-05d5-475d-b95f-9ba0d5b43df4" (UID: "86d7b6e3-05d5-475d-b95f-9ba0d5b43df4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.184585 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.184625 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.184641 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs7jx\" (UniqueName: \"kubernetes.io/projected/86d7b6e3-05d5-475d-b95f-9ba0d5b43df4-kube-api-access-gs7jx\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.534848 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" event={"ID":"86d7b6e3-05d5-475d-b95f-9ba0d5b43df4","Type":"ContainerDied","Data":"f7d0a44f820ecdec63361aec2d469466a3b87821f41e9fb74e8440fe6dfd5999"} Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.534890 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d0a44f820ecdec63361aec2d469466a3b87821f41e9fb74e8440fe6dfd5999" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.534936 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.622482 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch"] Mar 20 18:02:35 crc kubenswrapper[4690]: E0320 18:02:35.622874 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c99c907-4cb3-4381-88e6-ada4a9efb021" containerName="oc" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.623077 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c99c907-4cb3-4381-88e6-ada4a9efb021" containerName="oc" Mar 20 18:02:35 crc kubenswrapper[4690]: E0320 18:02:35.623093 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86d7b6e3-05d5-475d-b95f-9ba0d5b43df4" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.623101 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="86d7b6e3-05d5-475d-b95f-9ba0d5b43df4" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.623273 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="86d7b6e3-05d5-475d-b95f-9ba0d5b43df4" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.623292 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c99c907-4cb3-4381-88e6-ada4a9efb021" containerName="oc" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.623863 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.627075 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.627126 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.627391 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.627592 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.638284 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch"] Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.696094 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl8mb\" (UniqueName: \"kubernetes.io/projected/6983c278-26ba-4802-9320-1270d48b04ce-kube-api-access-hl8mb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.696214 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.696323 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.798919 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl8mb\" (UniqueName: \"kubernetes.io/projected/6983c278-26ba-4802-9320-1270d48b04ce-kube-api-access-hl8mb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.799830 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.800128 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.804278 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.807546 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.825815 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl8mb\" (UniqueName: \"kubernetes.io/projected/6983c278-26ba-4802-9320-1270d48b04ce-kube-api-access-hl8mb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lbwch\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:35 crc kubenswrapper[4690]: I0320 18:02:35.943013 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:02:36 crc kubenswrapper[4690]: I0320 18:02:36.442928 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch"] Mar 20 18:02:36 crc kubenswrapper[4690]: I0320 18:02:36.454516 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:02:36 crc kubenswrapper[4690]: I0320 18:02:36.543879 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" event={"ID":"6983c278-26ba-4802-9320-1270d48b04ce","Type":"ContainerStarted","Data":"6fab4e80fe8170bd90fc2c74e7e080743ac452a61630a56d1107f34ccbf6479a"} Mar 20 18:02:37 crc kubenswrapper[4690]: I0320 18:02:37.554600 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" event={"ID":"6983c278-26ba-4802-9320-1270d48b04ce","Type":"ContainerStarted","Data":"c10f0800984ecc3f9897fe003e50d1f2939aaebd700623d024c99f2915195080"} Mar 20 18:02:37 crc kubenswrapper[4690]: I0320 18:02:37.609462 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" podStartSLOduration=2.134060668 podStartE2EDuration="2.609436501s" podCreationTimestamp="2026-03-20 18:02:35 +0000 UTC" firstStartedPulling="2026-03-20 18:02:36.454189012 +0000 UTC m=+1831.320014700" lastFinishedPulling="2026-03-20 18:02:36.929564825 +0000 UTC m=+1831.795390533" observedRunningTime="2026-03-20 18:02:37.580197098 +0000 UTC m=+1832.446022776" watchObservedRunningTime="2026-03-20 18:02:37.609436501 +0000 UTC 
m=+1832.475262189" Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.055283 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sh9q5"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.082414 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-61c8-account-create-update-nqdzc"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.098225 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c546-account-create-update-t5wsm"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.106031 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8927-account-create-update-r78hx"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.112938 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-w8n6k"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.120441 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sh9q5"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.127338 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h6p97"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.134371 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-w8n6k"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.141399 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8927-account-create-update-r78hx"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.148596 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-61c8-account-create-update-nqdzc"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.155414 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c546-account-create-update-t5wsm"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.162061 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h6p97"] Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.891574 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:02:45 crc kubenswrapper[4690]: E0320 18:02:45.892007 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.903740 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a5619a-4b07-4071-9e82-35d8f8a32f19" path="/var/lib/kubelet/pods/37a5619a-4b07-4071-9e82-35d8f8a32f19/volumes" Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.905076 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c7fc068-1b4a-4181-9cd5-cc9eb17d691b" path="/var/lib/kubelet/pods/3c7fc068-1b4a-4181-9cd5-cc9eb17d691b/volumes" Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.906331 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f8f9ef-16ac-496b-a070-f82d7a55e5f8" path="/var/lib/kubelet/pods/56f8f9ef-16ac-496b-a070-f82d7a55e5f8/volumes" Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.907600 4690 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7a616d63-75d6-49ee-b12c-e68bcc6303c8" path="/var/lib/kubelet/pods/7a616d63-75d6-49ee-b12c-e68bcc6303c8/volumes" Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.909169 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ed3d10-eba4-40ba-b635-7f43e2cc68d9" path="/var/lib/kubelet/pods/80ed3d10-eba4-40ba-b635-7f43e2cc68d9/volumes" Mar 20 18:02:45 crc kubenswrapper[4690]: I0320 18:02:45.909951 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c999e8-0459-4d7a-8369-a44ec4af0bde" path="/var/lib/kubelet/pods/c8c999e8-0459-4d7a-8369-a44ec4af0bde/volumes" Mar 20 18:02:50 crc kubenswrapper[4690]: I0320 18:02:50.042916 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7rf8f"] Mar 20 18:02:50 crc kubenswrapper[4690]: I0320 18:02:50.056539 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7rf8f"] Mar 20 18:02:51 crc kubenswrapper[4690]: I0320 18:02:51.895192 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c" path="/var/lib/kubelet/pods/ce1a2c5d-7796-4afb-b1b1-14c8f06ae79c/volumes" Mar 20 18:02:57 crc kubenswrapper[4690]: I0320 18:02:57.883863 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:02:57 crc kubenswrapper[4690]: E0320 18:02:57.884950 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:03:11 crc kubenswrapper[4690]: I0320 18:03:11.884520 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:03:11 crc kubenswrapper[4690]: E0320 18:03:11.885330 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:03:18 crc kubenswrapper[4690]: I0320 18:03:18.072999 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-t2qth"] Mar 20 18:03:18 crc kubenswrapper[4690]: I0320 18:03:18.081058 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-t2qth"] Mar 20 18:03:19 crc kubenswrapper[4690]: I0320 18:03:19.905589 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3770976f-1610-4bb2-97db-0d81d8af8de1" path="/var/lib/kubelet/pods/3770976f-1610-4bb2-97db-0d81d8af8de1/volumes" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.325618 4690 scope.go:117] "RemoveContainer" containerID="e1dffb5d84cdfe9d6befcccc1c07d8a45fcbc2d86ba715ce0f4edfcc4492de08" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.364583 4690 scope.go:117] "RemoveContainer" containerID="a745b162a284579d6444986b78d0fd90e5ca48bb8d085d6310eb736cbb4f4c5e" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.440057 4690 
scope.go:117] "RemoveContainer" containerID="2a7c80b59b16162714fe6ea48b60263010a288426068f3bed299f76fa86cb2dc" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.490511 4690 scope.go:117] "RemoveContainer" containerID="d0564bea1fff016e08463a9c5e3dcfd1bde664bd8343722644716c94489ee616" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.543628 4690 scope.go:117] "RemoveContainer" containerID="cba91b6ab23732ed261f40e322910c9f1b17b102b8693b9ec31cdbe5057efa66" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.600162 4690 scope.go:117] "RemoveContainer" containerID="cced9f852b989af8b65a3dc613df98da60471356b861f80cd8ca215aec818dc9" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.626064 4690 scope.go:117] "RemoveContainer" containerID="cb06015b063ddbeb5a91ce5edc99587e449ca03867ccb0724afd6ffc9abc6d78" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.666514 4690 scope.go:117] "RemoveContainer" containerID="b11c76373714c080709968c94f055d40a82ed3ee04a0368540f50a64bdf3515d" Mar 20 18:03:21 crc kubenswrapper[4690]: I0320 18:03:21.704867 4690 scope.go:117] "RemoveContainer" containerID="b627a95077c84b57c32b1a48a6bbaec55b07ac9158955445e7b39c8f085e3a67" Mar 20 18:03:23 crc kubenswrapper[4690]: I0320 18:03:23.037096 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dzgr7"] Mar 20 18:03:23 crc kubenswrapper[4690]: I0320 18:03:23.045110 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dzgr7"] Mar 20 18:03:23 crc kubenswrapper[4690]: I0320 18:03:23.884583 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:03:23 crc kubenswrapper[4690]: E0320 18:03:23.885041 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:03:23 crc kubenswrapper[4690]: I0320 18:03:23.904669 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a50078-3ca8-4c47-8067-7473a9376323" path="/var/lib/kubelet/pods/14a50078-3ca8-4c47-8067-7473a9376323/volumes" Mar 20 18:03:29 crc kubenswrapper[4690]: I0320 18:03:29.049289 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m2vln"] Mar 20 18:03:29 crc kubenswrapper[4690]: I0320 18:03:29.071116 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m2vln"] Mar 20 18:03:29 crc kubenswrapper[4690]: I0320 18:03:29.899173 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2055a33a-e663-4664-9240-aab3d338c45e" path="/var/lib/kubelet/pods/2055a33a-e663-4664-9240-aab3d338c45e/volumes" Mar 20 18:03:37 crc kubenswrapper[4690]: I0320 18:03:37.882987 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:03:37 crc kubenswrapper[4690]: E0320 18:03:37.883693 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:03:38 crc kubenswrapper[4690]: I0320 18:03:38.045144 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-m4wn2"] Mar 20 18:03:38 crc kubenswrapper[4690]: I0320 18:03:38.056963 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-m4wn2"] Mar 20 18:03:39 crc kubenswrapper[4690]: I0320 18:03:39.898489 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0" path="/var/lib/kubelet/pods/d1fc6c70-315f-47d3-b8d3-17e3da8ee4a0/volumes" Mar 20 18:03:40 crc kubenswrapper[4690]: I0320 18:03:40.035531 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wqk6t"] Mar 20 18:03:40 crc kubenswrapper[4690]: I0320 18:03:40.046855 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wqk6t"] Mar 20 18:03:41 crc kubenswrapper[4690]: I0320 18:03:41.907833 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3bcd50-5724-42a1-92df-262256c07d45" path="/var/lib/kubelet/pods/ef3bcd50-5724-42a1-92df-262256c07d45/volumes" Mar 20 18:03:48 crc kubenswrapper[4690]: I0320 18:03:48.326228 4690 generic.go:334] "Generic (PLEG): container finished" podID="6983c278-26ba-4802-9320-1270d48b04ce" containerID="c10f0800984ecc3f9897fe003e50d1f2939aaebd700623d024c99f2915195080" exitCode=0 Mar 20 18:03:48 crc kubenswrapper[4690]: I0320 18:03:48.326301 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" event={"ID":"6983c278-26ba-4802-9320-1270d48b04ce","Type":"ContainerDied","Data":"c10f0800984ecc3f9897fe003e50d1f2939aaebd700623d024c99f2915195080"} Mar 20 18:03:49 crc kubenswrapper[4690]: I0320 18:03:49.833288 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:03:49 crc kubenswrapper[4690]: I0320 18:03:49.887317 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:03:49 crc kubenswrapper[4690]: E0320 18:03:49.888488 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:03:49 crc kubenswrapper[4690]: I0320 18:03:49.941520 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl8mb\" (UniqueName: \"kubernetes.io/projected/6983c278-26ba-4802-9320-1270d48b04ce-kube-api-access-hl8mb\") pod \"6983c278-26ba-4802-9320-1270d48b04ce\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " Mar 20 18:03:49 crc kubenswrapper[4690]: I0320 18:03:49.941684 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-ssh-key-openstack-edpm-ipam\") pod \"6983c278-26ba-4802-9320-1270d48b04ce\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " Mar 20 18:03:49 crc kubenswrapper[4690]: I0320 18:03:49.941866 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-inventory\") pod \"6983c278-26ba-4802-9320-1270d48b04ce\" (UID: \"6983c278-26ba-4802-9320-1270d48b04ce\") " Mar 20 18:03:49 crc kubenswrapper[4690]: I0320 18:03:49.947323 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6983c278-26ba-4802-9320-1270d48b04ce-kube-api-access-hl8mb" (OuterVolumeSpecName: "kube-api-access-hl8mb") pod "6983c278-26ba-4802-9320-1270d48b04ce" (UID: "6983c278-26ba-4802-9320-1270d48b04ce"). InnerVolumeSpecName "kube-api-access-hl8mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:03:49 crc kubenswrapper[4690]: I0320 18:03:49.967390 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6983c278-26ba-4802-9320-1270d48b04ce" (UID: "6983c278-26ba-4802-9320-1270d48b04ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:03:49 crc kubenswrapper[4690]: I0320 18:03:49.996074 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-inventory" (OuterVolumeSpecName: "inventory") pod "6983c278-26ba-4802-9320-1270d48b04ce" (UID: "6983c278-26ba-4802-9320-1270d48b04ce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.044759 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.044933 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl8mb\" (UniqueName: \"kubernetes.io/projected/6983c278-26ba-4802-9320-1270d48b04ce-kube-api-access-hl8mb\") on node \"crc\" DevicePath \"\"" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.045056 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6983c278-26ba-4802-9320-1270d48b04ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.350848 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" event={"ID":"6983c278-26ba-4802-9320-1270d48b04ce","Type":"ContainerDied","Data":"6fab4e80fe8170bd90fc2c74e7e080743ac452a61630a56d1107f34ccbf6479a"} Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.351153 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fab4e80fe8170bd90fc2c74e7e080743ac452a61630a56d1107f34ccbf6479a" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.350903 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lbwch" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.438009 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx"] Mar 20 18:03:50 crc kubenswrapper[4690]: E0320 18:03:50.438500 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6983c278-26ba-4802-9320-1270d48b04ce" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.438519 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6983c278-26ba-4802-9320-1270d48b04ce" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.438699 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6983c278-26ba-4802-9320-1270d48b04ce" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.439421 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.441727 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.441739 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.442607 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.443627 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.447952 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx"] Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.553105 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.553216 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdz48\" (UniqueName: \"kubernetes.io/projected/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-kube-api-access-wdz48\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.553266 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.655162 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.655325 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdz48\" (UniqueName: \"kubernetes.io/projected/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-kube-api-access-wdz48\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.655356 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.659512 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.660919 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.681107 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdz48\" (UniqueName: \"kubernetes.io/projected/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-kube-api-access-wdz48\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nszcx\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:50 crc kubenswrapper[4690]: I0320 18:03:50.768006 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:51 crc kubenswrapper[4690]: I0320 18:03:51.351402 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx"] Mar 20 18:03:51 crc kubenswrapper[4690]: I0320 18:03:51.365547 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" event={"ID":"3fa5c87e-a9cd-4046-9344-3a66c0c0977c","Type":"ContainerStarted","Data":"9cd5fbe24eb26ea1da2af8b6d764254a28ded0c54e03c67fb189ba424db42b4a"} Mar 20 18:03:52 crc kubenswrapper[4690]: I0320 18:03:52.375466 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" event={"ID":"3fa5c87e-a9cd-4046-9344-3a66c0c0977c","Type":"ContainerStarted","Data":"464ebbfee67a78fdcb90c79f73c98906a7d9136a5c528a000abc966b88c3677a"} Mar 20 18:03:52 crc kubenswrapper[4690]: I0320 18:03:52.389515 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" podStartSLOduration=1.964198114 podStartE2EDuration="2.389494078s" podCreationTimestamp="2026-03-20 18:03:50 +0000 UTC" firstStartedPulling="2026-03-20 18:03:51.358925177 +0000 UTC m=+1906.224750865" lastFinishedPulling="2026-03-20 18:03:51.784221151 +0000 UTC m=+1906.650046829" observedRunningTime="2026-03-20 18:03:52.38884468 +0000 UTC m=+1907.254670368" watchObservedRunningTime="2026-03-20 18:03:52.389494078 +0000 UTC m=+1907.255319756" Mar 20 18:03:57 crc kubenswrapper[4690]: I0320 18:03:57.428463 4690 generic.go:334] "Generic (PLEG): container finished" podID="3fa5c87e-a9cd-4046-9344-3a66c0c0977c" 
containerID="464ebbfee67a78fdcb90c79f73c98906a7d9136a5c528a000abc966b88c3677a" exitCode=0 Mar 20 18:03:57 crc kubenswrapper[4690]: I0320 18:03:57.428562 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" event={"ID":"3fa5c87e-a9cd-4046-9344-3a66c0c0977c","Type":"ContainerDied","Data":"464ebbfee67a78fdcb90c79f73c98906a7d9136a5c528a000abc966b88c3677a"} Mar 20 18:03:58 crc kubenswrapper[4690]: I0320 18:03:58.861852 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:58 crc kubenswrapper[4690]: I0320 18:03:58.920302 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-inventory\") pod \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " Mar 20 18:03:58 crc kubenswrapper[4690]: I0320 18:03:58.920450 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdz48\" (UniqueName: \"kubernetes.io/projected/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-kube-api-access-wdz48\") pod \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " Mar 20 18:03:58 crc kubenswrapper[4690]: I0320 18:03:58.920521 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-ssh-key-openstack-edpm-ipam\") pod \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\" (UID: \"3fa5c87e-a9cd-4046-9344-3a66c0c0977c\") " Mar 20 18:03:58 crc kubenswrapper[4690]: I0320 18:03:58.929542 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-kube-api-access-wdz48" (OuterVolumeSpecName: "kube-api-access-wdz48") pod "3fa5c87e-a9cd-4046-9344-3a66c0c0977c" (UID: "3fa5c87e-a9cd-4046-9344-3a66c0c0977c"). InnerVolumeSpecName "kube-api-access-wdz48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:03:58 crc kubenswrapper[4690]: I0320 18:03:58.956349 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-inventory" (OuterVolumeSpecName: "inventory") pod "3fa5c87e-a9cd-4046-9344-3a66c0c0977c" (UID: "3fa5c87e-a9cd-4046-9344-3a66c0c0977c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:03:58 crc kubenswrapper[4690]: I0320 18:03:58.964182 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3fa5c87e-a9cd-4046-9344-3a66c0c0977c" (UID: "3fa5c87e-a9cd-4046-9344-3a66c0c0977c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.022491 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.022533 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdz48\" (UniqueName: \"kubernetes.io/projected/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-kube-api-access-wdz48\") on node \"crc\" DevicePath \"\"" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.022549 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa5c87e-a9cd-4046-9344-3a66c0c0977c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.446822 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" event={"ID":"3fa5c87e-a9cd-4046-9344-3a66c0c0977c","Type":"ContainerDied","Data":"9cd5fbe24eb26ea1da2af8b6d764254a28ded0c54e03c67fb189ba424db42b4a"} Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.446877 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd5fbe24eb26ea1da2af8b6d764254a28ded0c54e03c67fb189ba424db42b4a" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.446945 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nszcx" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.533753 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk"] Mar 20 18:03:59 crc kubenswrapper[4690]: E0320 18:03:59.534172 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa5c87e-a9cd-4046-9344-3a66c0c0977c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.534191 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa5c87e-a9cd-4046-9344-3a66c0c0977c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.534412 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa5c87e-a9cd-4046-9344-3a66c0c0977c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.535048 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.537000 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.537246 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.537457 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.546121 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk"] Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.546623 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.634420 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.634525 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.634626 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4ldw\" (UniqueName: \"kubernetes.io/projected/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-kube-api-access-d4ldw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.735365 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4ldw\" (UniqueName: \"kubernetes.io/projected/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-kube-api-access-d4ldw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.735488 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.735583 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.739599 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.740181 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.755725 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4ldw\" (UniqueName: \"kubernetes.io/projected/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-kube-api-access-d4ldw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xpjkk\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:03:59 crc kubenswrapper[4690]: I0320 18:03:59.858673 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.134975 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567164-cd9d7"] Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.136654 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.138597 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.138686 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.139353 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.142012 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2hq\" (UniqueName: \"kubernetes.io/projected/6adac120-f240-4287-881f-428d4400c7c2-kube-api-access-kh2hq\") pod \"auto-csr-approver-29567164-cd9d7\" (UID: \"6adac120-f240-4287-881f-428d4400c7c2\") " pod="openshift-infra/auto-csr-approver-29567164-cd9d7" Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.144506 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-cd9d7"] Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.218536 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk"] Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.243364 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2hq\" (UniqueName: \"kubernetes.io/projected/6adac120-f240-4287-881f-428d4400c7c2-kube-api-access-kh2hq\") pod \"auto-csr-approver-29567164-cd9d7\" (UID: \"6adac120-f240-4287-881f-428d4400c7c2\") " pod="openshift-infra/auto-csr-approver-29567164-cd9d7" Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.260630 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2hq\" (UniqueName: \"kubernetes.io/projected/6adac120-f240-4287-881f-428d4400c7c2-kube-api-access-kh2hq\") pod \"auto-csr-approver-29567164-cd9d7\" (UID: \"6adac120-f240-4287-881f-428d4400c7c2\") " pod="openshift-infra/auto-csr-approver-29567164-cd9d7" Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.454852 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" event={"ID":"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc","Type":"ContainerStarted","Data":"de6299abd79cd27587ee55221074980892fa946b52f0f72fb40ea5fd6fe77255"} Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.458759 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" Mar 20 18:04:00 crc kubenswrapper[4690]: W0320 18:04:00.925468 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6adac120_f240_4287_881f_428d4400c7c2.slice/crio-d2f4987f6dfca7268e1fe7dc99662b721bcf1737dff47384f6574b331a8a178a WatchSource:0}: Error finding container d2f4987f6dfca7268e1fe7dc99662b721bcf1737dff47384f6574b331a8a178a: Status 404 returned error can't find the container with id d2f4987f6dfca7268e1fe7dc99662b721bcf1737dff47384f6574b331a8a178a Mar 20 18:04:00 crc kubenswrapper[4690]: I0320 18:04:00.927725 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-cd9d7"] Mar 20 18:04:01 crc kubenswrapper[4690]: I0320 18:04:01.467010 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" event={"ID":"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc","Type":"ContainerStarted","Data":"c0911d807ad5b52a394e7e0448d2a80573927b60c8b58e357ff429a280c49d8e"} Mar 20 18:04:01 crc kubenswrapper[4690]: I0320 18:04:01.471886 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" event={"ID":"6adac120-f240-4287-881f-428d4400c7c2","Type":"ContainerStarted","Data":"d2f4987f6dfca7268e1fe7dc99662b721bcf1737dff47384f6574b331a8a178a"} Mar 20 18:04:01 crc kubenswrapper[4690]: I0320 18:04:01.515108 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" podStartSLOduration=2.082099719 podStartE2EDuration="2.51508702s" podCreationTimestamp="2026-03-20 18:03:59 +0000 UTC" firstStartedPulling="2026-03-20 18:04:00.221830928 +0000 UTC m=+1915.087656616" lastFinishedPulling="2026-03-20 18:04:00.654818209 +0000 UTC m=+1915.520643917" observedRunningTime="2026-03-20 18:04:01.486782834 +0000 UTC m=+1916.352608512" watchObservedRunningTime="2026-03-20 18:04:01.51508702 +0000 UTC m=+1916.380912698" Mar 20 18:04:02 crc kubenswrapper[4690]: I0320 18:04:02.487478 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" event={"ID":"6adac120-f240-4287-881f-428d4400c7c2","Type":"ContainerStarted","Data":"18952c04d289dd13580ce4f54fd816b23f488300e4119b032de7140bafb75567"} Mar 20 18:04:02 crc kubenswrapper[4690]: I0320 18:04:02.510378 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" podStartSLOduration=1.512976551 podStartE2EDuration="2.510351329s" podCreationTimestamp="2026-03-20 18:04:00 +0000 UTC" firstStartedPulling="2026-03-20 18:04:00.928014595 +0000 UTC m=+1915.793840273" lastFinishedPulling="2026-03-20 18:04:01.925389343 +0000 UTC m=+1916.791215051" observedRunningTime="2026-03-20 18:04:02.507967412 +0000 UTC m=+1917.373793090" watchObservedRunningTime="2026-03-20 18:04:02.510351329 +0000 UTC m=+1917.376177007" Mar 20 18:04:03 crc kubenswrapper[4690]: I0320 18:04:03.502609 4690 generic.go:334] "Generic (PLEG): container finished" podID="6adac120-f240-4287-881f-428d4400c7c2" containerID="18952c04d289dd13580ce4f54fd816b23f488300e4119b032de7140bafb75567" exitCode=0 Mar 20 18:04:03 crc kubenswrapper[4690]: I0320 18:04:03.502686 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" 
event={"ID":"6adac120-f240-4287-881f-428d4400c7c2","Type":"ContainerDied","Data":"18952c04d289dd13580ce4f54fd816b23f488300e4119b032de7140bafb75567"} Mar 20 18:04:03 crc kubenswrapper[4690]: I0320 18:04:03.883438 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:04:04 crc kubenswrapper[4690]: I0320 18:04:04.516049 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"9c743870b72976847070b0c9956af89e5f5f2891d80131c888a10eec990b9c51"} Mar 20 18:04:04 crc kubenswrapper[4690]: I0320 18:04:04.929298 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.037889 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh2hq\" (UniqueName: \"kubernetes.io/projected/6adac120-f240-4287-881f-428d4400c7c2-kube-api-access-kh2hq\") pod \"6adac120-f240-4287-881f-428d4400c7c2\" (UID: \"6adac120-f240-4287-881f-428d4400c7c2\") " Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.045449 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6adac120-f240-4287-881f-428d4400c7c2-kube-api-access-kh2hq" (OuterVolumeSpecName: "kube-api-access-kh2hq") pod "6adac120-f240-4287-881f-428d4400c7c2" (UID: "6adac120-f240-4287-881f-428d4400c7c2"). InnerVolumeSpecName "kube-api-access-kh2hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.140216 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh2hq\" (UniqueName: \"kubernetes.io/projected/6adac120-f240-4287-881f-428d4400c7c2-kube-api-access-kh2hq\") on node \"crc\" DevicePath \"\"" Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.528508 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" event={"ID":"6adac120-f240-4287-881f-428d4400c7c2","Type":"ContainerDied","Data":"d2f4987f6dfca7268e1fe7dc99662b721bcf1737dff47384f6574b331a8a178a"} Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.528887 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2f4987f6dfca7268e1fe7dc99662b721bcf1737dff47384f6574b331a8a178a" Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.528661 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-cd9d7" Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.595066 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-tf2mq"] Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.605102 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-tf2mq"] Mar 20 18:04:05 crc kubenswrapper[4690]: I0320 18:04:05.895331 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aaec591-2764-4591-9113-632b649d5d7b" path="/var/lib/kubelet/pods/6aaec591-2764-4591-9113-632b649d5d7b/volumes" Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.033205 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2bb3-account-create-update-t2z8f"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.044242 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3684-account-create-update-9hhdp"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.077210 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nb497"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.085532 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3684-account-create-update-9hhdp"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.093561 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nb497"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.101367 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2bb3-account-create-update-t2z8f"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.108569 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-plcj5"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.116178 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-plcj5"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.122819 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rb6zb"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.131291 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rb6zb"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.140659 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ce9f-account-create-update-4vcxr"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.148153 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ce9f-account-create-update-4vcxr"] Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.894620 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140a83c9-1fb7-49cc-83a6-b78677db1779" path="/var/lib/kubelet/pods/140a83c9-1fb7-49cc-83a6-b78677db1779/volumes" Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.895153 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="393d76ad-66f6-46fe-93d8-833e5193f216" path="/var/lib/kubelet/pods/393d76ad-66f6-46fe-93d8-833e5193f216/volumes" Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.895743 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ffe945e-e151-44eb-82d1-99c46c113fbe" path="/var/lib/kubelet/pods/6ffe945e-e151-44eb-82d1-99c46c113fbe/volumes" Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.896228 4690 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9643bf40-9abd-47b9-9e39-2b1dfea6949d" path="/var/lib/kubelet/pods/9643bf40-9abd-47b9-9e39-2b1dfea6949d/volumes" Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.897300 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30" path="/var/lib/kubelet/pods/ebf4cd57-6e6d-40f1-9fb6-a0d140c04c30/volumes" Mar 20 18:04:19 crc kubenswrapper[4690]: I0320 18:04:19.897807 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbcd1d86-01fb-4773-a19c-10ac29e045e1" path="/var/lib/kubelet/pods/fbcd1d86-01fb-4773-a19c-10ac29e045e1/volumes" Mar 20 18:04:21 crc kubenswrapper[4690]: I0320 18:04:21.948803 4690 scope.go:117] "RemoveContainer" containerID="61274cb8cd6778715a81cba83f182cc3dc615d2d6ca808fe2febc61a37e4d6ad" Mar 20 18:04:21 crc kubenswrapper[4690]: I0320 18:04:21.979739 4690 scope.go:117] "RemoveContainer" containerID="f6803aedebf555d37ebe569b5527b1cd56fd53b3197d0cbbd757d5db758379ad" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.025562 4690 scope.go:117] "RemoveContainer" containerID="a2c347262d69e38b72766e97307bcd4b12f3a01e70cb971c0fcefa75c411a292" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.079469 4690 scope.go:117] "RemoveContainer" containerID="2d7067f9eb49bcecf0d8a345aba52d85b7adeb2469cdac922ca43bee76fec1df" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.147100 4690 scope.go:117] "RemoveContainer" containerID="5a32bba702758b598f20bc79be94a3a7ce52e126fb8463b7326ee083977e5faa" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.196092 4690 scope.go:117] "RemoveContainer" containerID="a7b92f445b086ac03ad7068b57f19c3f42f306c08763ea5e7ee0bf1bb4b060a9" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.247534 4690 scope.go:117] "RemoveContainer" containerID="c23a13d04ecada2157aaa2f4170be1d91cd1e581bebe8dfc0552d1ff02ca790a" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.267143 4690 scope.go:117] "RemoveContainer" containerID="12214f370e00b564156a5d39029a1d0d2c6c81f91e1b974f4dcdfb2559e97034" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.290051 4690 scope.go:117] "RemoveContainer" containerID="f5979325d976675a62f975210086321a1495bad2c2f3af8a0b22cdbd2b5a2e40" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.324624 4690 scope.go:117] "RemoveContainer" containerID="2271441d1328cfe6e45b0d4e507a22440a0fcaa13840aada66b397bde92e354c" Mar 20 18:04:22 crc kubenswrapper[4690]: I0320 18:04:22.341434 4690 scope.go:117] "RemoveContainer" containerID="d559e825f90c6dd81b674f00f720dd771ed4238cfc8ce5c0300dcfadf0dd3bb6" Mar 20 18:04:39 crc kubenswrapper[4690]: I0320 18:04:39.839447 4690 generic.go:334] "Generic (PLEG): container finished" podID="7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc" containerID="c0911d807ad5b52a394e7e0448d2a80573927b60c8b58e357ff429a280c49d8e" exitCode=0 Mar 20 18:04:39 crc kubenswrapper[4690]: I0320 18:04:39.839539 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" event={"ID":"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc","Type":"ContainerDied","Data":"c0911d807ad5b52a394e7e0448d2a80573927b60c8b58e357ff429a280c49d8e"} Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.256847 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.356440 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-inventory\") pod \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.356674 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4ldw\" (UniqueName: \"kubernetes.io/projected/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-kube-api-access-d4ldw\") pod \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.357735 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-ssh-key-openstack-edpm-ipam\") pod \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\" (UID: \"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc\") " Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.364723 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-kube-api-access-d4ldw" (OuterVolumeSpecName: "kube-api-access-d4ldw") pod "7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc" (UID: "7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc"). InnerVolumeSpecName "kube-api-access-d4ldw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.393343 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-inventory" (OuterVolumeSpecName: "inventory") pod "7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc" (UID: "7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.426920 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc" (UID: "7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.465908 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.465977 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4ldw\" (UniqueName: \"kubernetes.io/projected/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-kube-api-access-d4ldw\") on node \"crc\" DevicePath \"\"" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.466001 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.861367 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" event={"ID":"7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc","Type":"ContainerDied","Data":"de6299abd79cd27587ee55221074980892fa946b52f0f72fb40ea5fd6fe77255"} Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.861429 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de6299abd79cd27587ee55221074980892fa946b52f0f72fb40ea5fd6fe77255" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.861551 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xpjkk" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.964844 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs"] Mar 20 18:04:41 crc kubenswrapper[4690]: E0320 18:04:41.965556 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.965668 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:04:41 crc kubenswrapper[4690]: E0320 18:04:41.965771 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6adac120-f240-4287-881f-428d4400c7c2" containerName="oc" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.965877 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6adac120-f240-4287-881f-428d4400c7c2" containerName="oc" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.966187 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6adac120-f240-4287-881f-428d4400c7c2" containerName="oc" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.966317 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.968211 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.971531 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.975898 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.976356 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.976557 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:04:41 crc kubenswrapper[4690]: I0320 18:04:41.978786 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs"] Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.078643 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdhrl\" (UniqueName: \"kubernetes.io/projected/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-kube-api-access-cdhrl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.078759 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.078818 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.180750 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.180832 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.180887 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdhrl\" (UniqueName: 
\"kubernetes.io/projected/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-kube-api-access-cdhrl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.198032 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.198129 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.200499 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdhrl\" (UniqueName: \"kubernetes.io/projected/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-kube-api-access-cdhrl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p85rs\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.329082 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:04:42 crc kubenswrapper[4690]: I0320 18:04:42.972519 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs"] Mar 20 18:04:43 crc kubenswrapper[4690]: I0320 18:04:43.898861 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" event={"ID":"a59c2f4e-a048-421a-b4db-5411eeb2c3fd","Type":"ContainerStarted","Data":"4cc18eee0233ab62c85ab9074b1758e25a760de5e1033eaa53f8f48d628c5454"} Mar 20 18:04:43 crc kubenswrapper[4690]: I0320 18:04:43.901989 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" event={"ID":"a59c2f4e-a048-421a-b4db-5411eeb2c3fd","Type":"ContainerStarted","Data":"a1d932bb267c6ec4439e49980ad86b4baf27836ca2e1d0086aa00cb25b137052"} Mar 20 18:04:43 crc kubenswrapper[4690]: I0320 18:04:43.905787 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" podStartSLOduration=2.4901788959999998 podStartE2EDuration="2.905760507s" podCreationTimestamp="2026-03-20 18:04:41 +0000 UTC" firstStartedPulling="2026-03-20 18:04:42.968090759 +0000 UTC m=+1957.833916457" lastFinishedPulling="2026-03-20 18:04:43.38367239 +0000 UTC m=+1958.249498068" observedRunningTime="2026-03-20 18:04:43.897812394 +0000 UTC m=+1958.763638082" watchObservedRunningTime="2026-03-20 18:04:43.905760507 +0000 UTC m=+1958.771586205" Mar 20 18:04:46 crc kubenswrapper[4690]: I0320 18:04:46.053569 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-njclq"] Mar 20 18:04:46 crc kubenswrapper[4690]: I0320 
18:04:46.071961 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-njclq"] Mar 20 18:04:47 crc kubenswrapper[4690]: I0320 18:04:47.899364 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95830c09-d53f-4e08-800d-09d227668aee" path="/var/lib/kubelet/pods/95830c09-d53f-4e08-800d-09d227668aee/volumes" Mar 20 18:05:11 crc kubenswrapper[4690]: I0320 18:05:11.060195 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-69qfb"] Mar 20 18:05:11 crc kubenswrapper[4690]: I0320 18:05:11.069798 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-69qfb"] Mar 20 18:05:11 crc kubenswrapper[4690]: I0320 18:05:11.898419 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cba8a0-5804-4d01-bcdb-ef490500501f" path="/var/lib/kubelet/pods/11cba8a0-5804-4d01-bcdb-ef490500501f/volumes" Mar 20 18:05:12 crc kubenswrapper[4690]: I0320 18:05:12.026517 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-454zt"] Mar 20 18:05:12 crc kubenswrapper[4690]: I0320 18:05:12.035361 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-454zt"] Mar 20 18:05:13 crc kubenswrapper[4690]: I0320 18:05:13.897191 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e" path="/var/lib/kubelet/pods/d3f5fa9c-b4e5-4674-8ecf-2dd41a12852e/volumes" Mar 20 18:05:22 crc kubenswrapper[4690]: I0320 18:05:22.567349 4690 scope.go:117] "RemoveContainer" containerID="a53656e4c8e345ea3e2b042f3181137b42a535e3c48b19f229cba3b9985e607a" Mar 20 18:05:22 crc kubenswrapper[4690]: I0320 18:05:22.616677 4690 scope.go:117] "RemoveContainer" containerID="d7edac921af3e2d5feb159e51483e142289f9d76a7f9e650bb9c796b33e41066" Mar 20 18:05:22 crc kubenswrapper[4690]: I0320 18:05:22.668736 4690 scope.go:117] "RemoveContainer" containerID="8ba3dabaad1997eae9e7f119e3e94ab9802dfb2f29f3870414a98004cdbf6e7b" Mar 20 18:05:34 crc kubenswrapper[4690]: I0320 18:05:34.371977 4690 generic.go:334] "Generic (PLEG): container finished" podID="a59c2f4e-a048-421a-b4db-5411eeb2c3fd" containerID="4cc18eee0233ab62c85ab9074b1758e25a760de5e1033eaa53f8f48d628c5454" exitCode=0 Mar 20 18:05:34 crc kubenswrapper[4690]: I0320 18:05:34.372020 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" event={"ID":"a59c2f4e-a048-421a-b4db-5411eeb2c3fd","Type":"ContainerDied","Data":"4cc18eee0233ab62c85ab9074b1758e25a760de5e1033eaa53f8f48d628c5454"} Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.831395 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.880871 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdhrl\" (UniqueName: \"kubernetes.io/projected/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-kube-api-access-cdhrl\") pod \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.881027 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-inventory\") pod \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.881176 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-ssh-key-openstack-edpm-ipam\") pod \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\" (UID: \"a59c2f4e-a048-421a-b4db-5411eeb2c3fd\") " Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.886758 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-kube-api-access-cdhrl" (OuterVolumeSpecName: "kube-api-access-cdhrl") pod "a59c2f4e-a048-421a-b4db-5411eeb2c3fd" (UID: "a59c2f4e-a048-421a-b4db-5411eeb2c3fd"). InnerVolumeSpecName "kube-api-access-cdhrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.916100 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a59c2f4e-a048-421a-b4db-5411eeb2c3fd" (UID: "a59c2f4e-a048-421a-b4db-5411eeb2c3fd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.917159 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-inventory" (OuterVolumeSpecName: "inventory") pod "a59c2f4e-a048-421a-b4db-5411eeb2c3fd" (UID: "a59c2f4e-a048-421a-b4db-5411eeb2c3fd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.984092 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdhrl\" (UniqueName: \"kubernetes.io/projected/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-kube-api-access-cdhrl\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.984133 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:35 crc kubenswrapper[4690]: I0320 18:05:35.984149 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a59c2f4e-a048-421a-b4db-5411eeb2c3fd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.398801 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" event={"ID":"a59c2f4e-a048-421a-b4db-5411eeb2c3fd","Type":"ContainerDied","Data":"a1d932bb267c6ec4439e49980ad86b4baf27836ca2e1d0086aa00cb25b137052"} Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.399103 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1d932bb267c6ec4439e49980ad86b4baf27836ca2e1d0086aa00cb25b137052" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.398837 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p85rs" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.501772 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fnwhc"] Mar 20 18:05:36 crc kubenswrapper[4690]: E0320 18:05:36.502186 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59c2f4e-a048-421a-b4db-5411eeb2c3fd" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.502207 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59c2f4e-a048-421a-b4db-5411eeb2c3fd" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.502431 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59c2f4e-a048-421a-b4db-5411eeb2c3fd" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.503075 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.511860 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.511916 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.511935 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.512001 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.518612 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fnwhc"] Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.698359 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.698459 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.698554 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4lsw\" (UniqueName: \"kubernetes.io/projected/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-kube-api-access-l4lsw\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.800566 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4lsw\" (UniqueName: \"kubernetes.io/projected/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-kube-api-access-l4lsw\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.800740 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.800794 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc 
kubenswrapper[4690]: I0320 18:05:36.805987 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.806353 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:36 crc kubenswrapper[4690]: I0320 18:05:36.822810 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4lsw\" (UniqueName: \"kubernetes.io/projected/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-kube-api-access-l4lsw\") pod \"ssh-known-hosts-edpm-deployment-fnwhc\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:37 crc kubenswrapper[4690]: I0320 18:05:37.122604 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:37 crc kubenswrapper[4690]: I0320 18:05:37.659956 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fnwhc"] Mar 20 18:05:38 crc kubenswrapper[4690]: I0320 18:05:38.419818 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" event={"ID":"7d323f18-a4a8-4074-8b3f-cafcb23bcd33","Type":"ContainerStarted","Data":"8504daf877c0cdc5e78c03be1f70da7ce98e6e32e335d2f1ce74052eb07ee3bb"} Mar 20 18:05:38 crc kubenswrapper[4690]: I0320 18:05:38.420216 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" event={"ID":"7d323f18-a4a8-4074-8b3f-cafcb23bcd33","Type":"ContainerStarted","Data":"6387cb9fbce3b69e039bb48c8205e4bfb29bb647519abd6100aa196f1e9c082e"} Mar 20 18:05:38 crc kubenswrapper[4690]: I0320 18:05:38.441236 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" podStartSLOduration=1.967248429 podStartE2EDuration="2.441214992s" podCreationTimestamp="2026-03-20 18:05:36 +0000 UTC" firstStartedPulling="2026-03-20 18:05:37.662440794 +0000 UTC m=+2012.528266512" lastFinishedPulling="2026-03-20 18:05:38.136407397 +0000 UTC m=+2013.002233075" observedRunningTime="2026-03-20 18:05:38.438696131 +0000 UTC m=+2013.304521819" watchObservedRunningTime="2026-03-20 18:05:38.441214992 +0000 UTC m=+2013.307040680" Mar 20 18:05:45 crc kubenswrapper[4690]: I0320 18:05:45.497296 4690 generic.go:334] "Generic (PLEG): container finished" podID="7d323f18-a4a8-4074-8b3f-cafcb23bcd33" containerID="8504daf877c0cdc5e78c03be1f70da7ce98e6e32e335d2f1ce74052eb07ee3bb" exitCode=0 Mar 20 18:05:45 crc kubenswrapper[4690]: I0320 18:05:45.497407 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" event={"ID":"7d323f18-a4a8-4074-8b3f-cafcb23bcd33","Type":"ContainerDied","Data":"8504daf877c0cdc5e78c03be1f70da7ce98e6e32e335d2f1ce74052eb07ee3bb"} Mar 20 18:05:46 crc kubenswrapper[4690]: I0320 18:05:46.895770 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.013500 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-ssh-key-openstack-edpm-ipam\") pod \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.013703 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-inventory-0\") pod \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.013797 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4lsw\" (UniqueName: \"kubernetes.io/projected/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-kube-api-access-l4lsw\") pod \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\" (UID: \"7d323f18-a4a8-4074-8b3f-cafcb23bcd33\") " Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.019443 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-kube-api-access-l4lsw" (OuterVolumeSpecName: "kube-api-access-l4lsw") pod "7d323f18-a4a8-4074-8b3f-cafcb23bcd33" (UID: "7d323f18-a4a8-4074-8b3f-cafcb23bcd33"). InnerVolumeSpecName "kube-api-access-l4lsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.048704 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7d323f18-a4a8-4074-8b3f-cafcb23bcd33" (UID: "7d323f18-a4a8-4074-8b3f-cafcb23bcd33"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.050143 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "7d323f18-a4a8-4074-8b3f-cafcb23bcd33" (UID: "7d323f18-a4a8-4074-8b3f-cafcb23bcd33"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.115938 4690 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.115991 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4lsw\" (UniqueName: \"kubernetes.io/projected/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-kube-api-access-l4lsw\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.116013 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d323f18-a4a8-4074-8b3f-cafcb23bcd33-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.526226 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" event={"ID":"7d323f18-a4a8-4074-8b3f-cafcb23bcd33","Type":"ContainerDied","Data":"6387cb9fbce3b69e039bb48c8205e4bfb29bb647519abd6100aa196f1e9c082e"} Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.526597 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6387cb9fbce3b69e039bb48c8205e4bfb29bb647519abd6100aa196f1e9c082e" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.526534 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fnwhc" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.621324 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz"] Mar 20 18:05:47 crc kubenswrapper[4690]: E0320 18:05:47.621907 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d323f18-a4a8-4074-8b3f-cafcb23bcd33" containerName="ssh-known-hosts-edpm-deployment" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.621933 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d323f18-a4a8-4074-8b3f-cafcb23bcd33" containerName="ssh-known-hosts-edpm-deployment" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.622153 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d323f18-a4a8-4074-8b3f-cafcb23bcd33" containerName="ssh-known-hosts-edpm-deployment" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.622981 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.625237 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.625573 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.626954 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.627165 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.652782 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz"] Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.728263 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.728366 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.728878 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6dmp\" (UniqueName: \"kubernetes.io/projected/d7945514-9f35-4a0f-86f3-3d8e03a03d75-kube-api-access-n6dmp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.831858 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.832380 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.832626 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6dmp\" (UniqueName: \"kubernetes.io/projected/d7945514-9f35-4a0f-86f3-3d8e03a03d75-kube-api-access-n6dmp\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.837403 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.838613 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.866554 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6dmp\" (UniqueName: \"kubernetes.io/projected/d7945514-9f35-4a0f-86f3-3d8e03a03d75-kube-api-access-n6dmp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8zmzz\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:47 crc kubenswrapper[4690]: I0320 18:05:47.946567 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:48 crc kubenswrapper[4690]: I0320 18:05:48.524951 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz"] Mar 20 18:05:48 crc kubenswrapper[4690]: W0320 18:05:48.530672 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7945514_9f35_4a0f_86f3_3d8e03a03d75.slice/crio-8a4f38e4faec9b5525b9d75a59c9be9cce361e2fc0667885fb162ccfb1acb531 WatchSource:0}: Error finding container 8a4f38e4faec9b5525b9d75a59c9be9cce361e2fc0667885fb162ccfb1acb531: Status 404 returned error can't find the container with id 8a4f38e4faec9b5525b9d75a59c9be9cce361e2fc0667885fb162ccfb1acb531 Mar 20 18:05:49 crc kubenswrapper[4690]: I0320 18:05:49.564820 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" event={"ID":"d7945514-9f35-4a0f-86f3-3d8e03a03d75","Type":"ContainerStarted","Data":"664355e688a9dae7d8a85e0f2dbbc762fb48a8b558509ec2888cf90687e01720"} Mar 20 18:05:49 crc kubenswrapper[4690]: I0320 18:05:49.565676 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" event={"ID":"d7945514-9f35-4a0f-86f3-3d8e03a03d75","Type":"ContainerStarted","Data":"8a4f38e4faec9b5525b9d75a59c9be9cce361e2fc0667885fb162ccfb1acb531"} Mar 20 18:05:49 crc kubenswrapper[4690]: I0320 18:05:49.587231 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" podStartSLOduration=2.148315245 podStartE2EDuration="2.587210322s" podCreationTimestamp="2026-03-20 18:05:47 +0000 UTC" firstStartedPulling="2026-03-20 18:05:48.533634913 +0000 UTC m=+2023.399460631" lastFinishedPulling="2026-03-20 18:05:48.97253001 +0000 UTC m=+2023.838355708" 
observedRunningTime="2026-03-20 18:05:49.584597959 +0000 UTC m=+2024.450423637" watchObservedRunningTime="2026-03-20 18:05:49.587210322 +0000 UTC m=+2024.453036000" Mar 20 18:05:56 crc kubenswrapper[4690]: I0320 18:05:56.048145 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5q5q8"] Mar 20 18:05:56 crc kubenswrapper[4690]: I0320 18:05:56.055595 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5q5q8"] Mar 20 18:05:57 crc kubenswrapper[4690]: I0320 18:05:57.656821 4690 generic.go:334] "Generic (PLEG): container finished" podID="d7945514-9f35-4a0f-86f3-3d8e03a03d75" containerID="664355e688a9dae7d8a85e0f2dbbc762fb48a8b558509ec2888cf90687e01720" exitCode=0 Mar 20 18:05:57 crc kubenswrapper[4690]: I0320 18:05:57.656904 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" event={"ID":"d7945514-9f35-4a0f-86f3-3d8e03a03d75","Type":"ContainerDied","Data":"664355e688a9dae7d8a85e0f2dbbc762fb48a8b558509ec2888cf90687e01720"} Mar 20 18:05:57 crc kubenswrapper[4690]: I0320 18:05:57.896854 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1296ff75-6f88-4e2c-bf63-b46c3b090a6d" path="/var/lib/kubelet/pods/1296ff75-6f88-4e2c-bf63-b46c3b090a6d/volumes" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.186990 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.201809 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6dmp\" (UniqueName: \"kubernetes.io/projected/d7945514-9f35-4a0f-86f3-3d8e03a03d75-kube-api-access-n6dmp\") pod \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.212220 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7945514-9f35-4a0f-86f3-3d8e03a03d75-kube-api-access-n6dmp" (OuterVolumeSpecName: "kube-api-access-n6dmp") pod "d7945514-9f35-4a0f-86f3-3d8e03a03d75" (UID: "d7945514-9f35-4a0f-86f3-3d8e03a03d75"). InnerVolumeSpecName "kube-api-access-n6dmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.303639 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-ssh-key-openstack-edpm-ipam\") pod \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.305493 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-inventory\") pod \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\" (UID: \"d7945514-9f35-4a0f-86f3-3d8e03a03d75\") " Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.306214 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6dmp\" (UniqueName: \"kubernetes.io/projected/d7945514-9f35-4a0f-86f3-3d8e03a03d75-kube-api-access-n6dmp\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.328871 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-inventory" (OuterVolumeSpecName: "inventory") pod "d7945514-9f35-4a0f-86f3-3d8e03a03d75" (UID: "d7945514-9f35-4a0f-86f3-3d8e03a03d75"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.347610 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d7945514-9f35-4a0f-86f3-3d8e03a03d75" (UID: "d7945514-9f35-4a0f-86f3-3d8e03a03d75"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.407714 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.407743 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7945514-9f35-4a0f-86f3-3d8e03a03d75-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.677100 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" event={"ID":"d7945514-9f35-4a0f-86f3-3d8e03a03d75","Type":"ContainerDied","Data":"8a4f38e4faec9b5525b9d75a59c9be9cce361e2fc0667885fb162ccfb1acb531"} Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.677137 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a4f38e4faec9b5525b9d75a59c9be9cce361e2fc0667885fb162ccfb1acb531" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.677192 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8zmzz" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.781654 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q"] Mar 20 18:05:59 crc kubenswrapper[4690]: E0320 18:05:59.782116 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7945514-9f35-4a0f-86f3-3d8e03a03d75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.782138 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7945514-9f35-4a0f-86f3-3d8e03a03d75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.782344 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7945514-9f35-4a0f-86f3-3d8e03a03d75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.783078 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.785879 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.786312 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.786587 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.787140 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.788498 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q"] Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.921659 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.921911 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:05:59 crc kubenswrapper[4690]: I0320 18:05:59.922047 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vqr6\" (UniqueName: \"kubernetes.io/projected/94cbf02f-b47c-44f2-ab14-bd01174bcc77-kube-api-access-8vqr6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.024002 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.024108 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vqr6\" (UniqueName: \"kubernetes.io/projected/94cbf02f-b47c-44f2-ab14-bd01174bcc77-kube-api-access-8vqr6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.024224 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.030760 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.031469 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.047569 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vqr6\" (UniqueName: \"kubernetes.io/projected/94cbf02f-b47c-44f2-ab14-bd01174bcc77-kube-api-access-8vqr6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.116091 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.158238 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567166-xg9rc"] Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.159757 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-xg9rc"] Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.159938 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.171271 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.171580 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.172543 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.329725 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp9xb\" (UniqueName: \"kubernetes.io/projected/95fcb3df-a81e-40b0-aefe-2f740db37426-kube-api-access-hp9xb\") pod \"auto-csr-approver-29567166-xg9rc\" (UID: \"95fcb3df-a81e-40b0-aefe-2f740db37426\") " pod="openshift-infra/auto-csr-approver-29567166-xg9rc" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.432358 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp9xb\" (UniqueName: \"kubernetes.io/projected/95fcb3df-a81e-40b0-aefe-2f740db37426-kube-api-access-hp9xb\") pod \"auto-csr-approver-29567166-xg9rc\" (UID: \"95fcb3df-a81e-40b0-aefe-2f740db37426\") " pod="openshift-infra/auto-csr-approver-29567166-xg9rc" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.451723 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp9xb\" (UniqueName: \"kubernetes.io/projected/95fcb3df-a81e-40b0-aefe-2f740db37426-kube-api-access-hp9xb\") pod \"auto-csr-approver-29567166-xg9rc\" (UID: \"95fcb3df-a81e-40b0-aefe-2f740db37426\") " pod="openshift-infra/auto-csr-approver-29567166-xg9rc" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.554193 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.665390 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q"] Mar 20 18:06:00 crc kubenswrapper[4690]: I0320 18:06:00.699891 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" event={"ID":"94cbf02f-b47c-44f2-ab14-bd01174bcc77","Type":"ContainerStarted","Data":"b57073f8072acb6474ea99ab58a92b5d725c12dde95282794720ffdac28756e9"} Mar 20 18:06:01 crc kubenswrapper[4690]: W0320 18:06:01.023370 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95fcb3df_a81e_40b0_aefe_2f740db37426.slice/crio-8620a75221f58d9118103ead23bf31f340828c1f9e56bf10a33861fee687cefc WatchSource:0}: Error finding container 8620a75221f58d9118103ead23bf31f340828c1f9e56bf10a33861fee687cefc: Status 404 returned error can't find the container with id 8620a75221f58d9118103ead23bf31f340828c1f9e56bf10a33861fee687cefc Mar 20 18:06:01 crc kubenswrapper[4690]: I0320 18:06:01.024607 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-xg9rc"] Mar 20 18:06:01 crc kubenswrapper[4690]: I0320 18:06:01.713406 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" event={"ID":"95fcb3df-a81e-40b0-aefe-2f740db37426","Type":"ContainerStarted","Data":"8620a75221f58d9118103ead23bf31f340828c1f9e56bf10a33861fee687cefc"} Mar 20 18:06:01 crc kubenswrapper[4690]: I0320 18:06:01.715804 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" event={"ID":"94cbf02f-b47c-44f2-ab14-bd01174bcc77","Type":"ContainerStarted","Data":"bd2be04e0cee996cae45ea5f1bf81f3189191f36bee6278d72f798f1702ec35f"} Mar 20 18:06:01 crc kubenswrapper[4690]: I0320 18:06:01.759845 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" podStartSLOduration=2.3298759159999998 podStartE2EDuration="2.759816452s" podCreationTimestamp="2026-03-20 18:05:59 +0000 UTC" firstStartedPulling="2026-03-20 18:06:00.67483759 +0000 UTC m=+2035.540663268" lastFinishedPulling="2026-03-20 18:06:01.104778126 +0000 UTC m=+2035.970603804" observedRunningTime="2026-03-20 18:06:01.743430951 +0000 UTC m=+2036.609256669" watchObservedRunningTime="2026-03-20 18:06:01.759816452 +0000 UTC m=+2036.625642170" Mar 20 18:06:02 crc kubenswrapper[4690]: I0320 18:06:02.725566 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" event={"ID":"95fcb3df-a81e-40b0-aefe-2f740db37426","Type":"ContainerStarted","Data":"21cc76818b2012ca77adf91823bae802e3edaeebbf49fee7438e2987ab894ec3"} Mar 20 18:06:02 crc kubenswrapper[4690]: I0320 18:06:02.743975 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" podStartSLOduration=1.500741635 podStartE2EDuration="2.743953078s" podCreationTimestamp="2026-03-20 18:06:00 +0000 UTC" firstStartedPulling="2026-03-20 18:06:01.026694779 +0000 UTC m=+2035.892520457" lastFinishedPulling="2026-03-20 18:06:02.269906212 +0000 UTC m=+2037.135731900" observedRunningTime="2026-03-20 18:06:02.742894489 +0000 UTC m=+2037.608720177" watchObservedRunningTime="2026-03-20 
18:06:02.743953078 +0000 UTC m=+2037.609778756" Mar 20 18:06:03 crc kubenswrapper[4690]: I0320 18:06:03.734462 4690 generic.go:334] "Generic (PLEG): container finished" podID="95fcb3df-a81e-40b0-aefe-2f740db37426" containerID="21cc76818b2012ca77adf91823bae802e3edaeebbf49fee7438e2987ab894ec3" exitCode=0 Mar 20 18:06:03 crc kubenswrapper[4690]: I0320 18:06:03.734532 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" event={"ID":"95fcb3df-a81e-40b0-aefe-2f740db37426","Type":"ContainerDied","Data":"21cc76818b2012ca77adf91823bae802e3edaeebbf49fee7438e2987ab894ec3"} Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.092778 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.226852 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp9xb\" (UniqueName: \"kubernetes.io/projected/95fcb3df-a81e-40b0-aefe-2f740db37426-kube-api-access-hp9xb\") pod \"95fcb3df-a81e-40b0-aefe-2f740db37426\" (UID: \"95fcb3df-a81e-40b0-aefe-2f740db37426\") " Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.234727 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fcb3df-a81e-40b0-aefe-2f740db37426-kube-api-access-hp9xb" (OuterVolumeSpecName: "kube-api-access-hp9xb") pod "95fcb3df-a81e-40b0-aefe-2f740db37426" (UID: "95fcb3df-a81e-40b0-aefe-2f740db37426"). InnerVolumeSpecName "kube-api-access-hp9xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.329447 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp9xb\" (UniqueName: \"kubernetes.io/projected/95fcb3df-a81e-40b0-aefe-2f740db37426-kube-api-access-hp9xb\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.754940 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" event={"ID":"95fcb3df-a81e-40b0-aefe-2f740db37426","Type":"ContainerDied","Data":"8620a75221f58d9118103ead23bf31f340828c1f9e56bf10a33861fee687cefc"} Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.754989 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8620a75221f58d9118103ead23bf31f340828c1f9e56bf10a33861fee687cefc" Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.755016 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-xg9rc" Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.814229 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-25b7n"] Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.826384 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-25b7n"] Mar 20 18:06:05 crc kubenswrapper[4690]: I0320 18:06:05.895143 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a94716-92ff-429b-b528-1144c64571c4" path="/var/lib/kubelet/pods/a1a94716-92ff-429b-b528-1144c64571c4/volumes" Mar 20 18:06:10 crc kubenswrapper[4690]: I0320 18:06:10.808619 4690 generic.go:334] "Generic (PLEG): container finished" podID="94cbf02f-b47c-44f2-ab14-bd01174bcc77" containerID="bd2be04e0cee996cae45ea5f1bf81f3189191f36bee6278d72f798f1702ec35f" exitCode=0 Mar 20 18:06:10 crc kubenswrapper[4690]: I0320 18:06:10.808696 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" event={"ID":"94cbf02f-b47c-44f2-ab14-bd01174bcc77","Type":"ContainerDied","Data":"bd2be04e0cee996cae45ea5f1bf81f3189191f36bee6278d72f798f1702ec35f"} Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.240351 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.381703 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-inventory\") pod \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.381785 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vqr6\" (UniqueName: \"kubernetes.io/projected/94cbf02f-b47c-44f2-ab14-bd01174bcc77-kube-api-access-8vqr6\") pod \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.381805 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-ssh-key-openstack-edpm-ipam\") pod \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\" (UID: \"94cbf02f-b47c-44f2-ab14-bd01174bcc77\") " Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.388291 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cbf02f-b47c-44f2-ab14-bd01174bcc77-kube-api-access-8vqr6" (OuterVolumeSpecName: "kube-api-access-8vqr6") pod "94cbf02f-b47c-44f2-ab14-bd01174bcc77" (UID: "94cbf02f-b47c-44f2-ab14-bd01174bcc77"). InnerVolumeSpecName "kube-api-access-8vqr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.410381 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "94cbf02f-b47c-44f2-ab14-bd01174bcc77" (UID: "94cbf02f-b47c-44f2-ab14-bd01174bcc77"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.410822 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-inventory" (OuterVolumeSpecName: "inventory") pod "94cbf02f-b47c-44f2-ab14-bd01174bcc77" (UID: "94cbf02f-b47c-44f2-ab14-bd01174bcc77"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.484393 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.484440 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vqr6\" (UniqueName: \"kubernetes.io/projected/94cbf02f-b47c-44f2-ab14-bd01174bcc77-kube-api-access-8vqr6\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.484454 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94cbf02f-b47c-44f2-ab14-bd01174bcc77-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.828683 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" event={"ID":"94cbf02f-b47c-44f2-ab14-bd01174bcc77","Type":"ContainerDied","Data":"b57073f8072acb6474ea99ab58a92b5d725c12dde95282794720ffdac28756e9"} Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.829026 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b57073f8072acb6474ea99ab58a92b5d725c12dde95282794720ffdac28756e9" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.828726 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.948456 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc"] Mar 20 18:06:12 crc kubenswrapper[4690]: E0320 18:06:12.948889 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cbf02f-b47c-44f2-ab14-bd01174bcc77" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.948912 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cbf02f-b47c-44f2-ab14-bd01174bcc77" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:12 crc kubenswrapper[4690]: E0320 18:06:12.948940 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fcb3df-a81e-40b0-aefe-2f740db37426" containerName="oc" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.948946 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fcb3df-a81e-40b0-aefe-2f740db37426" containerName="oc" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.949118 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="94cbf02f-b47c-44f2-ab14-bd01174bcc77" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.949140 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fcb3df-a81e-40b0-aefe-2f740db37426" containerName="oc" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.949811 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.952523 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.952906 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.953158 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.953423 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.953643 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.954085 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.954115 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.954188 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 18:06:12 crc kubenswrapper[4690]: I0320 18:06:12.958399 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc"] Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.096237 4690 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.096425 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.096479 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.096538 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.096716 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097039 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097204 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097283 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwmpr\" 
(UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-kube-api-access-zwmpr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097367 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097413 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097460 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097515 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097612 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.097674 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.199898 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.199954 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.199998 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200029 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200051 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200083 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200112 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200163 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200203 4690 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200221 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwmpr\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-kube-api-access-zwmpr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200246 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200292 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200318 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.200339 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.206655 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.206682 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.208818 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.208990 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.210994 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.212143 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.212516 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.213770 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.214171 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.215482 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.216749 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.219593 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.222828 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.236857 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwmpr\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-kube-api-access-zwmpr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-w5shc\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.274331 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:13 crc kubenswrapper[4690]: I0320 18:06:13.836817 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc"] Mar 20 18:06:14 crc kubenswrapper[4690]: I0320 18:06:14.850986 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" event={"ID":"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd","Type":"ContainerStarted","Data":"0cf9b69bc8538fdd96e941d5e2974d026c7e49a63c3c3289a7fe6538d8c266ee"} Mar 20 18:06:14 crc kubenswrapper[4690]: I0320 18:06:14.851514 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" event={"ID":"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd","Type":"ContainerStarted","Data":"8f9553cd9eb4eba9a4d9f30dcac7cf20728a3480a42b700395a2619574a65fc6"} Mar 20 18:06:14 crc kubenswrapper[4690]: I0320 18:06:14.875587 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" podStartSLOduration=2.371251319 podStartE2EDuration="2.875568636s" podCreationTimestamp="2026-03-20 18:06:12 +0000 UTC" firstStartedPulling="2026-03-20 18:06:13.857080284 +0000 UTC m=+2048.722905962" lastFinishedPulling="2026-03-20 18:06:14.361397591 +0000 UTC m=+2049.227223279" observedRunningTime="2026-03-20 18:06:14.872393937 +0000 UTC m=+2049.738219635" watchObservedRunningTime="2026-03-20 18:06:14.875568636 +0000 UTC m=+2049.741394304" Mar 20 18:06:22 crc kubenswrapper[4690]: I0320 18:06:22.787620 4690 scope.go:117] "RemoveContainer" containerID="5b896b52c746e31af9bd4f8d5f536765ca7340c733d71497868aaaf70f62f225" Mar 20 18:06:22 crc kubenswrapper[4690]: I0320 18:06:22.842962 4690 scope.go:117] "RemoveContainer" containerID="899a33ad242cb750149df083fff3c13bf65d20eae215086d4e8c6bfa16c62150" Mar 20 18:06:24 crc kubenswrapper[4690]: I0320 18:06:24.273745 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:06:24 crc kubenswrapper[4690]: I0320 18:06:24.274241 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:06:54 crc kubenswrapper[4690]: I0320 18:06:54.274169 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:06:54 crc kubenswrapper[4690]: I0320 18:06:54.274920 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:06:54 crc kubenswrapper[4690]: I0320 18:06:54.292079 4690 
generic.go:334] "Generic (PLEG): container finished" podID="e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" containerID="0cf9b69bc8538fdd96e941d5e2974d026c7e49a63c3c3289a7fe6538d8c266ee" exitCode=0 Mar 20 18:06:54 crc kubenswrapper[4690]: I0320 18:06:54.292121 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" event={"ID":"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd","Type":"ContainerDied","Data":"0cf9b69bc8538fdd96e941d5e2974d026c7e49a63c3c3289a7fe6538d8c266ee"} Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.792546 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.848490 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-neutron-metadata-combined-ca-bundle\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.848528 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-inventory\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.848556 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.849477 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.849537 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ssh-key-openstack-edpm-ipam\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.849645 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-bootstrap-combined-ca-bundle\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.849703 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwmpr\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-kube-api-access-zwmpr\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.849747 4690 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-libvirt-combined-ca-bundle\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.849859 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-repo-setup-combined-ca-bundle\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.849897 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.849973 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.850009 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-nova-combined-ca-bundle\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.850116 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-telemetry-combined-ca-bundle\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.850189 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ovn-combined-ca-bundle\") pod \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\" (UID: \"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd\") " Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.854578 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.854598 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.856801 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.857067 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.857953 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.858356 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.858491 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.860185 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.861459 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-kube-api-access-zwmpr" (OuterVolumeSpecName: "kube-api-access-zwmpr") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "kube-api-access-zwmpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.861502 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.874453 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.876823 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.903595 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.908327 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-inventory" (OuterVolumeSpecName: "inventory") pod "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" (UID: "e1dd8af3-0ac3-42c4-ba88-c891b8c971bd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953378 4690 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953422 4690 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953436 4690 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953450 4690 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953462 4690 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953473 4690 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953483 4690 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953494 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953505 4690 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953519 4690 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953530 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953541 4690 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953550 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwmpr\" (UniqueName: \"kubernetes.io/projected/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-kube-api-access-zwmpr\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:55 crc kubenswrapper[4690]: I0320 18:06:55.953560 4690 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1dd8af3-0ac3-42c4-ba88-c891b8c971bd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.318927 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" event={"ID":"e1dd8af3-0ac3-42c4-ba88-c891b8c971bd","Type":"ContainerDied","Data":"8f9553cd9eb4eba9a4d9f30dcac7cf20728a3480a42b700395a2619574a65fc6"} Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.319972 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f9553cd9eb4eba9a4d9f30dcac7cf20728a3480a42b700395a2619574a65fc6" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.318986 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-w5shc" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.594395 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh"] Mar 20 18:06:56 crc kubenswrapper[4690]: E0320 18:06:56.595027 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.595058 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.595452 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1dd8af3-0ac3-42c4-ba88-c891b8c971bd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.596596 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.599374 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.599432 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.599373 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.605887 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.606632 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.614053 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh"] Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.674148 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.674202 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.674376 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64a253c9-3348-4ba3-9d9a-755348ebf561-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.674420 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qhkx\" (UniqueName: \"kubernetes.io/projected/64a253c9-3348-4ba3-9d9a-755348ebf561-kube-api-access-7qhkx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.674455 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.776317 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64a253c9-3348-4ba3-9d9a-755348ebf561-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.776424 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qhkx\" (UniqueName: \"kubernetes.io/projected/64a253c9-3348-4ba3-9d9a-755348ebf561-kube-api-access-7qhkx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.776476 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.776870 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.776942 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.778556 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64a253c9-3348-4ba3-9d9a-755348ebf561-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.783174 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.784843 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.791952 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.794181 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qhkx\" (UniqueName: \"kubernetes.io/projected/64a253c9-3348-4ba3-9d9a-755348ebf561-kube-api-access-7qhkx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jhldh\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:56 crc kubenswrapper[4690]: I0320 18:06:56.924502 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:06:57 crc kubenswrapper[4690]: I0320 18:06:57.527756 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh"] Mar 20 18:06:58 crc kubenswrapper[4690]: I0320 18:06:58.345403 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" event={"ID":"64a253c9-3348-4ba3-9d9a-755348ebf561","Type":"ContainerStarted","Data":"a4e3ca18fe6cfaefe63edb639d8c11ae2de5a28ca8cade6a14a015a86578ebc5"} Mar 20 18:06:58 crc kubenswrapper[4690]: I0320 18:06:58.345824 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" event={"ID":"64a253c9-3348-4ba3-9d9a-755348ebf561","Type":"ContainerStarted","Data":"fd5eed149a0f9ac8adca9405b3e63d9663b9d36c204fd3494c969e4b47300209"} Mar 20 18:07:24 crc kubenswrapper[4690]: I0320 18:07:24.274175 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:07:24 crc kubenswrapper[4690]: I0320 18:07:24.275485 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:07:24 crc kubenswrapper[4690]: I0320 18:07:24.275564 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 18:07:24 crc kubenswrapper[4690]: I0320 18:07:24.276582 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c743870b72976847070b0c9956af89e5f5f2891d80131c888a10eec990b9c51"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:07:24 crc kubenswrapper[4690]: I0320 18:07:24.276684 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://9c743870b72976847070b0c9956af89e5f5f2891d80131c888a10eec990b9c51" gracePeriod=600 Mar 20 18:07:24 crc kubenswrapper[4690]: I0320 
18:07:24.630400 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="9c743870b72976847070b0c9956af89e5f5f2891d80131c888a10eec990b9c51" exitCode=0 Mar 20 18:07:24 crc kubenswrapper[4690]: I0320 18:07:24.630655 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"9c743870b72976847070b0c9956af89e5f5f2891d80131c888a10eec990b9c51"} Mar 20 18:07:24 crc kubenswrapper[4690]: I0320 18:07:24.630861 4690 scope.go:117] "RemoveContainer" containerID="965e35066bff888caca5b994dc3af56f56ca5e0e9e97a4c5970943a091971930" Mar 20 18:07:25 crc kubenswrapper[4690]: I0320 18:07:25.641360 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12"} Mar 20 18:07:25 crc kubenswrapper[4690]: I0320 18:07:25.665866 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" podStartSLOduration=29.153414311 podStartE2EDuration="29.665841994s" podCreationTimestamp="2026-03-20 18:06:56 +0000 UTC" firstStartedPulling="2026-03-20 18:06:57.546120642 +0000 UTC m=+2092.411946340" lastFinishedPulling="2026-03-20 18:06:58.058548305 +0000 UTC m=+2092.924374023" observedRunningTime="2026-03-20 18:06:58.373796611 +0000 UTC m=+2093.239622289" watchObservedRunningTime="2026-03-20 18:07:25.665841994 +0000 UTC m=+2120.531667672" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.145818 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567168-dhksw"] Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.147505 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-dhksw" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.150245 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.150472 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.150708 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.154217 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-dhksw"] Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.327499 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5kpb\" (UniqueName: \"kubernetes.io/projected/42b7b7a1-8685-423b-a27c-fd5c5785c056-kube-api-access-w5kpb\") pod \"auto-csr-approver-29567168-dhksw\" (UID: \"42b7b7a1-8685-423b-a27c-fd5c5785c056\") " pod="openshift-infra/auto-csr-approver-29567168-dhksw" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.428943 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5kpb\" (UniqueName: \"kubernetes.io/projected/42b7b7a1-8685-423b-a27c-fd5c5785c056-kube-api-access-w5kpb\") pod \"auto-csr-approver-29567168-dhksw\" (UID: \"42b7b7a1-8685-423b-a27c-fd5c5785c056\") " pod="openshift-infra/auto-csr-approver-29567168-dhksw" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.449029 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5kpb\" (UniqueName: \"kubernetes.io/projected/42b7b7a1-8685-423b-a27c-fd5c5785c056-kube-api-access-w5kpb\") pod \"auto-csr-approver-29567168-dhksw\" (UID: \"42b7b7a1-8685-423b-a27c-fd5c5785c056\") " pod="openshift-infra/auto-csr-approver-29567168-dhksw" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.480671 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-dhksw" Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.920395 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-dhksw"] Mar 20 18:08:00 crc kubenswrapper[4690]: I0320 18:08:00.920919 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:08:01 crc kubenswrapper[4690]: I0320 18:08:01.022476 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-dhksw" event={"ID":"42b7b7a1-8685-423b-a27c-fd5c5785c056","Type":"ContainerStarted","Data":"98eb7f23dfe84dd6a099af7577902b1e71936ef85e1269c320032895ec130e2e"} Mar 20 18:08:03 crc kubenswrapper[4690]: I0320 18:08:03.041928 4690 generic.go:334] "Generic (PLEG): container finished" podID="42b7b7a1-8685-423b-a27c-fd5c5785c056" containerID="849623b4d82636d8b77839631781372232dfffbfa828a867ce29cea9caefa3ee" exitCode=0 Mar 20 18:08:03 crc kubenswrapper[4690]: I0320 18:08:03.042324 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-dhksw" event={"ID":"42b7b7a1-8685-423b-a27c-fd5c5785c056","Type":"ContainerDied","Data":"849623b4d82636d8b77839631781372232dfffbfa828a867ce29cea9caefa3ee"} Mar 20 18:08:04 crc kubenswrapper[4690]: I0320 18:08:04.375744 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-dhksw" Mar 20 18:08:04 crc kubenswrapper[4690]: I0320 18:08:04.512012 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5kpb\" (UniqueName: \"kubernetes.io/projected/42b7b7a1-8685-423b-a27c-fd5c5785c056-kube-api-access-w5kpb\") pod \"42b7b7a1-8685-423b-a27c-fd5c5785c056\" (UID: \"42b7b7a1-8685-423b-a27c-fd5c5785c056\") " Mar 20 18:08:04 crc kubenswrapper[4690]: I0320 18:08:04.518533 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b7b7a1-8685-423b-a27c-fd5c5785c056-kube-api-access-w5kpb" (OuterVolumeSpecName: "kube-api-access-w5kpb") pod "42b7b7a1-8685-423b-a27c-fd5c5785c056" (UID: "42b7b7a1-8685-423b-a27c-fd5c5785c056"). InnerVolumeSpecName "kube-api-access-w5kpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:08:04 crc kubenswrapper[4690]: I0320 18:08:04.614493 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5kpb\" (UniqueName: \"kubernetes.io/projected/42b7b7a1-8685-423b-a27c-fd5c5785c056-kube-api-access-w5kpb\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:05 crc kubenswrapper[4690]: I0320 18:08:05.064739 4690 generic.go:334] "Generic (PLEG): container finished" podID="64a253c9-3348-4ba3-9d9a-755348ebf561" containerID="a4e3ca18fe6cfaefe63edb639d8c11ae2de5a28ca8cade6a14a015a86578ebc5" exitCode=0 Mar 20 18:08:05 crc kubenswrapper[4690]: I0320 18:08:05.065190 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" event={"ID":"64a253c9-3348-4ba3-9d9a-755348ebf561","Type":"ContainerDied","Data":"a4e3ca18fe6cfaefe63edb639d8c11ae2de5a28ca8cade6a14a015a86578ebc5"} Mar 20 18:08:05 crc kubenswrapper[4690]: I0320 18:08:05.067776 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-dhksw" event={"ID":"42b7b7a1-8685-423b-a27c-fd5c5785c056","Type":"ContainerDied","Data":"98eb7f23dfe84dd6a099af7577902b1e71936ef85e1269c320032895ec130e2e"} Mar 20 18:08:05 crc kubenswrapper[4690]: I0320 18:08:05.067820 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98eb7f23dfe84dd6a099af7577902b1e71936ef85e1269c320032895ec130e2e" Mar 20 18:08:05 crc kubenswrapper[4690]: I0320 18:08:05.067881 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-dhksw" Mar 20 18:08:05 crc kubenswrapper[4690]: I0320 18:08:05.463204 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-cs4rc"] Mar 20 18:08:05 crc kubenswrapper[4690]: I0320 18:08:05.481751 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-cs4rc"] Mar 20 18:08:05 crc kubenswrapper[4690]: I0320 18:08:05.897558 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c99c907-4cb3-4381-88e6-ada4a9efb021" path="/var/lib/kubelet/pods/5c99c907-4cb3-4381-88e6-ada4a9efb021/volumes" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.513139 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.653150 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-inventory\") pod \"64a253c9-3348-4ba3-9d9a-755348ebf561\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.653342 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ssh-key-openstack-edpm-ipam\") pod \"64a253c9-3348-4ba3-9d9a-755348ebf561\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.653476 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qhkx\" (UniqueName: \"kubernetes.io/projected/64a253c9-3348-4ba3-9d9a-755348ebf561-kube-api-access-7qhkx\") pod \"64a253c9-3348-4ba3-9d9a-755348ebf561\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.653521 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ovn-combined-ca-bundle\") pod \"64a253c9-3348-4ba3-9d9a-755348ebf561\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.653669 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64a253c9-3348-4ba3-9d9a-755348ebf561-ovncontroller-config-0\") pod \"64a253c9-3348-4ba3-9d9a-755348ebf561\" (UID: \"64a253c9-3348-4ba3-9d9a-755348ebf561\") " Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.659486 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a253c9-3348-4ba3-9d9a-755348ebf561-kube-api-access-7qhkx" (OuterVolumeSpecName: "kube-api-access-7qhkx") pod "64a253c9-3348-4ba3-9d9a-755348ebf561" (UID: "64a253c9-3348-4ba3-9d9a-755348ebf561"). InnerVolumeSpecName "kube-api-access-7qhkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.661127 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "64a253c9-3348-4ba3-9d9a-755348ebf561" (UID: "64a253c9-3348-4ba3-9d9a-755348ebf561"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.691369 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "64a253c9-3348-4ba3-9d9a-755348ebf561" (UID: "64a253c9-3348-4ba3-9d9a-755348ebf561"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.702641 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a253c9-3348-4ba3-9d9a-755348ebf561-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "64a253c9-3348-4ba3-9d9a-755348ebf561" (UID: "64a253c9-3348-4ba3-9d9a-755348ebf561"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.702929 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-inventory" (OuterVolumeSpecName: "inventory") pod "64a253c9-3348-4ba3-9d9a-755348ebf561" (UID: "64a253c9-3348-4ba3-9d9a-755348ebf561"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.757168 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qhkx\" (UniqueName: \"kubernetes.io/projected/64a253c9-3348-4ba3-9d9a-755348ebf561-kube-api-access-7qhkx\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.757223 4690 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.757247 4690 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64a253c9-3348-4ba3-9d9a-755348ebf561-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.757287 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:06 crc kubenswrapper[4690]: I0320 18:08:06.757307 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a253c9-3348-4ba3-9d9a-755348ebf561-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.094244 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" event={"ID":"64a253c9-3348-4ba3-9d9a-755348ebf561","Type":"ContainerDied","Data":"fd5eed149a0f9ac8adca9405b3e63d9663b9d36c204fd3494c969e4b47300209"} Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.094319 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd5eed149a0f9ac8adca9405b3e63d9663b9d36c204fd3494c969e4b47300209" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.094391 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jhldh" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.167327 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s"] Mar 20 18:08:07 crc kubenswrapper[4690]: E0320 18:08:07.167881 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a253c9-3348-4ba3-9d9a-755348ebf561" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.167914 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a253c9-3348-4ba3-9d9a-755348ebf561" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 18:08:07 crc kubenswrapper[4690]: E0320 18:08:07.167930 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b7b7a1-8685-423b-a27c-fd5c5785c056" containerName="oc" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.167941 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b7b7a1-8685-423b-a27c-fd5c5785c056" containerName="oc" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.168248 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b7b7a1-8685-423b-a27c-fd5c5785c056" containerName="oc" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.168319 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a253c9-3348-4ba3-9d9a-755348ebf561" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.169116 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.170902 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.171276 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.171289 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.171624 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.171889 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.173698 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.182597 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s"] Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.270538 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2jpm\" (UniqueName: \"kubernetes.io/projected/c59bc866-150a-4671-8bbf-91aea8f32646-kube-api-access-s2jpm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.270614 4690 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.270738 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.270873 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.271026 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.271097 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.373854 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.374348 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.374446 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2jpm\" (UniqueName: 
\"kubernetes.io/projected/c59bc866-150a-4671-8bbf-91aea8f32646-kube-api-access-s2jpm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.374504 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.374580 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.374754 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.377955 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.378363 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.379057 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.379615 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.381978 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.393633 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2jpm\" (UniqueName: \"kubernetes.io/projected/c59bc866-150a-4671-8bbf-91aea8f32646-kube-api-access-s2jpm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:07 crc kubenswrapper[4690]: I0320 18:08:07.504902 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:08:08 crc kubenswrapper[4690]: I0320 18:08:08.044971 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s"] Mar 20 18:08:08 crc kubenswrapper[4690]: I0320 18:08:08.103756 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" event={"ID":"c59bc866-150a-4671-8bbf-91aea8f32646","Type":"ContainerStarted","Data":"2eba34c2794d0b566ca6aadeac42f9afcb1ba05c4325a09d244f910a84435dc6"} Mar 20 18:08:09 crc kubenswrapper[4690]: I0320 18:08:09.121302 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" event={"ID":"c59bc866-150a-4671-8bbf-91aea8f32646","Type":"ContainerStarted","Data":"a53bceeed7311f73da4dd597fa9bf6851cfe3e5b69acb64272f0656e5ca51178"} Mar 20 18:08:09 crc kubenswrapper[4690]: I0320 18:08:09.142710 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" podStartSLOduration=1.590458485 podStartE2EDuration="2.142693224s" podCreationTimestamp="2026-03-20 18:08:07 +0000 UTC" firstStartedPulling="2026-03-20 18:08:08.055059073 +0000 UTC m=+2162.920884751" lastFinishedPulling="2026-03-20 18:08:08.607293812 +0000 UTC m=+2163.473119490" observedRunningTime="2026-03-20 18:08:09.136771187 +0000 UTC m=+2164.002596855" watchObservedRunningTime="2026-03-20 18:08:09.142693224 +0000 UTC m=+2164.008518902" Mar 20 18:08:21 crc kubenswrapper[4690]: I0320 18:08:21.940575 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fh4jv"] Mar 20 18:08:21 crc kubenswrapper[4690]: I0320 18:08:21.944515 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:21 crc kubenswrapper[4690]: I0320 18:08:21.950574 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fh4jv"] Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.016195 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-utilities\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.016247 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-catalog-content\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.016302 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx7fj\" (UniqueName: \"kubernetes.io/projected/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-kube-api-access-zx7fj\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.117900 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-utilities\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.117975 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-catalog-content\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.118580 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-utilities\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.118957 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-catalog-content\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.119031 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx7fj\" (UniqueName: \"kubernetes.io/projected/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-kube-api-access-zx7fj\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.140239 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zx7fj\" (UniqueName: \"kubernetes.io/projected/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-kube-api-access-zx7fj\") pod \"redhat-operators-fh4jv\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.306617 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.773124 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fh4jv"] Mar 20 18:08:22 crc kubenswrapper[4690]: I0320 18:08:22.993677 4690 scope.go:117] "RemoveContainer" containerID="21e06f020d4543c1c68681b411fbfb109693f057944924580580bb928e32dd5a" Mar 20 18:08:23 crc kubenswrapper[4690]: I0320 18:08:23.285498 4690 generic.go:334] "Generic (PLEG): container finished" podID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerID="6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07" exitCode=0 Mar 20 18:08:23 crc kubenswrapper[4690]: I0320 18:08:23.285813 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh4jv" event={"ID":"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2","Type":"ContainerDied","Data":"6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07"} Mar 20 18:08:23 crc kubenswrapper[4690]: I0320 18:08:23.285874 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh4jv" event={"ID":"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2","Type":"ContainerStarted","Data":"c66a743d46cb77379a18b250298cca6aad78811adfdcb4a3ae2ed0a5832fa45f"} Mar 20 18:08:24 crc kubenswrapper[4690]: I0320 18:08:24.298647 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh4jv" event={"ID":"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2","Type":"ContainerStarted","Data":"c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d"} Mar 20 18:08:27 crc kubenswrapper[4690]: I0320 18:08:27.327823 4690 generic.go:334] "Generic (PLEG): container finished" podID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerID="c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d" exitCode=0 Mar 20 18:08:27 crc kubenswrapper[4690]: I0320 18:08:27.327902 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh4jv" event={"ID":"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2","Type":"ContainerDied","Data":"c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d"} Mar 20 18:08:28 crc kubenswrapper[4690]: I0320 18:08:28.338017 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh4jv" event={"ID":"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2","Type":"ContainerStarted","Data":"42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614"} Mar 20 18:08:28 crc kubenswrapper[4690]: I0320 18:08:28.354873 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fh4jv" podStartSLOduration=2.661588076 podStartE2EDuration="7.354854771s" podCreationTimestamp="2026-03-20 18:08:21 +0000 UTC" firstStartedPulling="2026-03-20 18:08:23.287582628 +0000 UTC m=+2178.153408316" lastFinishedPulling="2026-03-20 18:08:27.980849333 +0000 UTC m=+2182.846675011" observedRunningTime="2026-03-20 18:08:28.352106203 +0000 UTC m=+2183.217931901" watchObservedRunningTime="2026-03-20 18:08:28.354854771 +0000 UTC m=+2183.220680449" Mar 20 
18:08:32 crc kubenswrapper[4690]: I0320 18:08:32.307269 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:32 crc kubenswrapper[4690]: I0320 18:08:32.307573 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:33 crc kubenswrapper[4690]: I0320 18:08:33.367895 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fh4jv" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="registry-server" probeResult="failure" output=< Mar 20 18:08:33 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 18:08:33 crc kubenswrapper[4690]: > Mar 20 18:08:42 crc kubenswrapper[4690]: I0320 18:08:42.385349 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:42 crc kubenswrapper[4690]: I0320 18:08:42.459540 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:42 crc kubenswrapper[4690]: I0320 18:08:42.639484 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fh4jv"] Mar 20 18:08:43 crc kubenswrapper[4690]: I0320 18:08:43.534044 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fh4jv" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="registry-server" containerID="cri-o://42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614" gracePeriod=2 Mar 20 18:08:43 crc kubenswrapper[4690]: I0320 18:08:43.990051 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.105066 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-utilities\") pod \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.105179 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx7fj\" (UniqueName: \"kubernetes.io/projected/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-kube-api-access-zx7fj\") pod \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.105380 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-catalog-content\") pod \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\" (UID: \"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2\") " Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.105837 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-utilities" (OuterVolumeSpecName: "utilities") pod "aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" (UID: "aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.111159 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-kube-api-access-zx7fj" (OuterVolumeSpecName: "kube-api-access-zx7fj") pod "aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" (UID: "aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2"). InnerVolumeSpecName "kube-api-access-zx7fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.207374 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx7fj\" (UniqueName: \"kubernetes.io/projected/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-kube-api-access-zx7fj\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.207406 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.240929 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" (UID: "aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.310045 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.547146 4690 generic.go:334] "Generic (PLEG): container finished" podID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerID="42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614" exitCode=0 Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.547196 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh4jv" event={"ID":"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2","Type":"ContainerDied","Data":"42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614"} Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.547233 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh4jv" event={"ID":"aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2","Type":"ContainerDied","Data":"c66a743d46cb77379a18b250298cca6aad78811adfdcb4a3ae2ed0a5832fa45f"} Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.547234 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fh4jv" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.547281 4690 scope.go:117] "RemoveContainer" containerID="42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.585872 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fh4jv"] Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.586656 4690 scope.go:117] "RemoveContainer" containerID="c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.593870 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fh4jv"] Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.621068 4690 scope.go:117] "RemoveContainer" containerID="6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.667768 4690 scope.go:117] "RemoveContainer" containerID="42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614" Mar 20 18:08:44 crc kubenswrapper[4690]: E0320 18:08:44.668239 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614\": container with ID starting with 42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614 not found: ID does not exist" containerID="42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.668325 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614"} err="failed to get container status \"42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614\": rpc error: code = NotFound desc = could not find container \"42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614\": container with ID starting with 42751200998eae39c7af305cfbdee1edc35b3363f66d73ce7bebc283e3e04614 not found: ID does not exist" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.668347 4690 scope.go:117] "RemoveContainer" containerID="c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d" Mar 20 18:08:44 crc kubenswrapper[4690]: E0320 18:08:44.669811 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d\": container with ID starting with c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d not found: ID does not exist" containerID="c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.669864 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d"} err="failed to get container status \"c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d\": rpc error: code = NotFound desc = could not find container \"c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d\": container with ID starting with c3a3abfae3945b388922674d8b398dd833221d4baf806ab7a8370e1a3d4a856d not found: ID does not exist" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.669900 4690 scope.go:117] "RemoveContainer" 
containerID="6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07" Mar 20 18:08:44 crc kubenswrapper[4690]: E0320 18:08:44.670430 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07\": container with ID starting with 6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07 not found: ID does not exist" containerID="6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07" Mar 20 18:08:44 crc kubenswrapper[4690]: I0320 18:08:44.670456 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07"} err="failed to get container status \"6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07\": rpc error: code = NotFound desc = could not find container \"6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07\": container with ID starting with 6bde172111608732ae0fc2b0eeeea6de1283e7bb3fddee828415f0a6e9003f07 not found: ID does not exist" Mar 20 18:08:45 crc kubenswrapper[4690]: I0320 18:08:45.896304 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" path="/var/lib/kubelet/pods/aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2/volumes" Mar 20 18:08:59 crc kubenswrapper[4690]: I0320 18:08:59.714154 4690 generic.go:334] "Generic (PLEG): container finished" podID="c59bc866-150a-4671-8bbf-91aea8f32646" containerID="a53bceeed7311f73da4dd597fa9bf6851cfe3e5b69acb64272f0656e5ca51178" exitCode=0 Mar 20 18:08:59 crc kubenswrapper[4690]: I0320 18:08:59.719914 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" event={"ID":"c59bc866-150a-4671-8bbf-91aea8f32646","Type":"ContainerDied","Data":"a53bceeed7311f73da4dd597fa9bf6851cfe3e5b69acb64272f0656e5ca51178"} Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.192150 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.358350 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2jpm\" (UniqueName: \"kubernetes.io/projected/c59bc866-150a-4671-8bbf-91aea8f32646-kube-api-access-s2jpm\") pod \"c59bc866-150a-4671-8bbf-91aea8f32646\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.358509 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-ssh-key-openstack-edpm-ipam\") pod \"c59bc866-150a-4671-8bbf-91aea8f32646\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.358564 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-inventory\") pod \"c59bc866-150a-4671-8bbf-91aea8f32646\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.358681 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c59bc866-150a-4671-8bbf-91aea8f32646\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.358706 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-nova-metadata-neutron-config-0\") pod \"c59bc866-150a-4671-8bbf-91aea8f32646\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.358725 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-metadata-combined-ca-bundle\") pod \"c59bc866-150a-4671-8bbf-91aea8f32646\" (UID: \"c59bc866-150a-4671-8bbf-91aea8f32646\") " Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.364127 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c59bc866-150a-4671-8bbf-91aea8f32646" (UID: "c59bc866-150a-4671-8bbf-91aea8f32646"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.365926 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59bc866-150a-4671-8bbf-91aea8f32646-kube-api-access-s2jpm" (OuterVolumeSpecName: "kube-api-access-s2jpm") pod "c59bc866-150a-4671-8bbf-91aea8f32646" (UID: "c59bc866-150a-4671-8bbf-91aea8f32646"). InnerVolumeSpecName "kube-api-access-s2jpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.388784 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c59bc866-150a-4671-8bbf-91aea8f32646" (UID: "c59bc866-150a-4671-8bbf-91aea8f32646"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.394014 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c59bc866-150a-4671-8bbf-91aea8f32646" (UID: "c59bc866-150a-4671-8bbf-91aea8f32646"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.408863 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-inventory" (OuterVolumeSpecName: "inventory") pod "c59bc866-150a-4671-8bbf-91aea8f32646" (UID: "c59bc866-150a-4671-8bbf-91aea8f32646"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.409087 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c59bc866-150a-4671-8bbf-91aea8f32646" (UID: "c59bc866-150a-4671-8bbf-91aea8f32646"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.461448 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2jpm\" (UniqueName: \"kubernetes.io/projected/c59bc866-150a-4671-8bbf-91aea8f32646-kube-api-access-s2jpm\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.461706 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.461828 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.461951 4690 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.462044 4690 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.462135 4690 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59bc866-150a-4671-8bbf-91aea8f32646-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.744655 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" event={"ID":"c59bc866-150a-4671-8bbf-91aea8f32646","Type":"ContainerDied","Data":"2eba34c2794d0b566ca6aadeac42f9afcb1ba05c4325a09d244f910a84435dc6"} Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.744709 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eba34c2794d0b566ca6aadeac42f9afcb1ba05c4325a09d244f910a84435dc6" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.745244 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.978200 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx"] Mar 20 18:09:01 crc kubenswrapper[4690]: E0320 18:09:01.979012 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="extract-content" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.979055 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="extract-content" Mar 20 18:09:01 crc kubenswrapper[4690]: E0320 18:09:01.979118 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="registry-server" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.979138 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="registry-server" Mar 20 18:09:01 crc kubenswrapper[4690]: E0320 18:09:01.979173 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="extract-utilities" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.979195 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="extract-utilities" Mar 20 18:09:01 crc kubenswrapper[4690]: E0320 18:09:01.979234 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59bc866-150a-4671-8bbf-91aea8f32646" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.979287 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59bc866-150a-4671-8bbf-91aea8f32646" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.979736 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff0d12d-4c9f-4d43-a8a2-f460bd1c62d2" containerName="registry-server" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.979791 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59bc866-150a-4671-8bbf-91aea8f32646" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.981006 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.986290 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.986779 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.987176 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.987279 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.988506 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx"] Mar 20 18:09:01 crc kubenswrapper[4690]: I0320 18:09:01.988982 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.075931 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.076014 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.076056 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.076124 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm9jv\" (UniqueName: \"kubernetes.io/projected/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-kube-api-access-vm9jv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.076219 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.178550 4690 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.178924 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.178968 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.179062 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm9jv\" (UniqueName: \"kubernetes.io/projected/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-kube-api-access-vm9jv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.179237 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.187148 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.187279 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.188343 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.203769 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.211211 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm9jv\" (UniqueName: \"kubernetes.io/projected/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-kube-api-access-vm9jv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-926mx\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.299994 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:09:02 crc kubenswrapper[4690]: I0320 18:09:02.856849 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx"] Mar 20 18:09:03 crc kubenswrapper[4690]: I0320 18:09:03.775814 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" event={"ID":"ca6878cf-74a4-4bf6-8e36-bf1a669d787f","Type":"ContainerStarted","Data":"b8645cdea1b9c53e76c0669a5c440ce546d7538d9a0c01075ac051411f16afc5"} Mar 20 18:09:03 crc kubenswrapper[4690]: I0320 18:09:03.777087 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" event={"ID":"ca6878cf-74a4-4bf6-8e36-bf1a669d787f","Type":"ContainerStarted","Data":"160a14d0c29241e32561934a9335f784ee0f4aa6094cf272b35d7add4a6d5a10"} Mar 20 18:09:03 crc kubenswrapper[4690]: I0320 18:09:03.811574 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" podStartSLOduration=2.3367238759999998 podStartE2EDuration="2.811542105s" podCreationTimestamp="2026-03-20 18:09:01 +0000 UTC" firstStartedPulling="2026-03-20 18:09:02.86233264 +0000 UTC m=+2217.728158318" lastFinishedPulling="2026-03-20 18:09:03.337150869 +0000 UTC m=+2218.202976547" observedRunningTime="2026-03-20 18:09:03.797774896 +0000 UTC m=+2218.663600594" watchObservedRunningTime="2026-03-20 18:09:03.811542105 +0000 UTC m=+2218.677367833" Mar 20 18:09:24 crc kubenswrapper[4690]: I0320 18:09:24.274080 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:09:24 crc kubenswrapper[4690]: I0320 18:09:24.274681 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.476428 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m9sw8"] Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.479443 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.489182 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9sw8"] Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.533925 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-catalog-content\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.533989 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-utilities\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.534068 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfq8h\" (UniqueName: \"kubernetes.io/projected/f9368f02-da6a-493a-909b-73a9d11b1370-kube-api-access-lfq8h\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.636790 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-catalog-content\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.636881 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-utilities\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.636978 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfq8h\" (UniqueName: \"kubernetes.io/projected/f9368f02-da6a-493a-909b-73a9d11b1370-kube-api-access-lfq8h\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.637433 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-catalog-content\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.637542 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-utilities\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.659201 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lfq8h\" (UniqueName: \"kubernetes.io/projected/f9368f02-da6a-493a-909b-73a9d11b1370-kube-api-access-lfq8h\") pod \"community-operators-m9sw8\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:45 crc kubenswrapper[4690]: I0320 18:09:45.796567 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:46 crc kubenswrapper[4690]: I0320 18:09:46.439663 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9sw8"] Mar 20 18:09:47 crc kubenswrapper[4690]: I0320 18:09:47.217034 4690 generic.go:334] "Generic (PLEG): container finished" podID="f9368f02-da6a-493a-909b-73a9d11b1370" containerID="712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f" exitCode=0 Mar 20 18:09:47 crc kubenswrapper[4690]: I0320 18:09:47.217087 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9sw8" event={"ID":"f9368f02-da6a-493a-909b-73a9d11b1370","Type":"ContainerDied","Data":"712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f"} Mar 20 18:09:47 crc kubenswrapper[4690]: I0320 18:09:47.217117 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9sw8" event={"ID":"f9368f02-da6a-493a-909b-73a9d11b1370","Type":"ContainerStarted","Data":"7bdfb1a36cff26262ecfd3492245d471d5b61136d2e29d4cfdd9d4d81c688d2e"} Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.433909 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rcdjr"] Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.439527 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.453209 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcdjr"] Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.500094 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-catalog-content\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.500246 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljz87\" (UniqueName: \"kubernetes.io/projected/51ffd9e1-7ec0-490c-b5bf-746f61587d79-kube-api-access-ljz87\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.500582 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-utilities\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.602415 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-catalog-content\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.602495 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljz87\" (UniqueName: \"kubernetes.io/projected/51ffd9e1-7ec0-490c-b5bf-746f61587d79-kube-api-access-ljz87\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.602625 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-utilities\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.603028 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-catalog-content\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.603059 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-utilities\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.623208 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ljz87\" (UniqueName: \"kubernetes.io/projected/51ffd9e1-7ec0-490c-b5bf-746f61587d79-kube-api-access-ljz87\") pod \"certified-operators-rcdjr\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:48 crc kubenswrapper[4690]: I0320 18:09:48.764467 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:49 crc kubenswrapper[4690]: I0320 18:09:49.237276 4690 generic.go:334] "Generic (PLEG): container finished" podID="f9368f02-da6a-493a-909b-73a9d11b1370" containerID="9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5" exitCode=0 Mar 20 18:09:49 crc kubenswrapper[4690]: I0320 18:09:49.237386 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9sw8" event={"ID":"f9368f02-da6a-493a-909b-73a9d11b1370","Type":"ContainerDied","Data":"9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5"} Mar 20 18:09:49 crc kubenswrapper[4690]: I0320 18:09:49.302138 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcdjr"] Mar 20 18:09:50 crc kubenswrapper[4690]: I0320 18:09:50.248541 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9sw8" event={"ID":"f9368f02-da6a-493a-909b-73a9d11b1370","Type":"ContainerStarted","Data":"d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0"} Mar 20 18:09:50 crc kubenswrapper[4690]: I0320 18:09:50.252395 4690 generic.go:334] "Generic (PLEG): container finished" podID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerID="a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010" exitCode=0 Mar 20 18:09:50 crc kubenswrapper[4690]: I0320 18:09:50.252501 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcdjr" event={"ID":"51ffd9e1-7ec0-490c-b5bf-746f61587d79","Type":"ContainerDied","Data":"a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010"} Mar 20 18:09:50 crc kubenswrapper[4690]: I0320 18:09:50.252839 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcdjr" event={"ID":"51ffd9e1-7ec0-490c-b5bf-746f61587d79","Type":"ContainerStarted","Data":"dfb7c260bf31487e48b1033734896c9785f51adcdb98412edb039e68db54a09e"} Mar 20 18:09:50 crc kubenswrapper[4690]: I0320 18:09:50.296461 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m9sw8" podStartSLOduration=2.812127097 podStartE2EDuration="5.296440919s" podCreationTimestamp="2026-03-20 18:09:45 +0000 UTC" firstStartedPulling="2026-03-20 18:09:47.220404293 +0000 UTC m=+2262.086229981" lastFinishedPulling="2026-03-20 18:09:49.704718125 +0000 UTC m=+2264.570543803" observedRunningTime="2026-03-20 18:09:50.273435489 +0000 UTC m=+2265.139261207" watchObservedRunningTime="2026-03-20 18:09:50.296440919 +0000 UTC m=+2265.162266597" Mar 20 18:09:52 crc kubenswrapper[4690]: I0320 18:09:52.270544 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcdjr" event={"ID":"51ffd9e1-7ec0-490c-b5bf-746f61587d79","Type":"ContainerStarted","Data":"4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091"} Mar 20 18:09:53 crc kubenswrapper[4690]: I0320 18:09:53.281286 4690 generic.go:334] "Generic (PLEG): container finished" 
podID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerID="4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091" exitCode=0 Mar 20 18:09:53 crc kubenswrapper[4690]: I0320 18:09:53.281363 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcdjr" event={"ID":"51ffd9e1-7ec0-490c-b5bf-746f61587d79","Type":"ContainerDied","Data":"4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091"} Mar 20 18:09:54 crc kubenswrapper[4690]: I0320 18:09:54.274112 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:09:54 crc kubenswrapper[4690]: I0320 18:09:54.274602 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:09:54 crc kubenswrapper[4690]: I0320 18:09:54.294740 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcdjr" event={"ID":"51ffd9e1-7ec0-490c-b5bf-746f61587d79","Type":"ContainerStarted","Data":"cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9"} Mar 20 18:09:54 crc kubenswrapper[4690]: I0320 18:09:54.330425 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rcdjr" podStartSLOduration=2.837290795 podStartE2EDuration="6.330405098s" podCreationTimestamp="2026-03-20 18:09:48 +0000 UTC" firstStartedPulling="2026-03-20 18:09:50.257439066 +0000 UTC m=+2265.123264754" lastFinishedPulling="2026-03-20 18:09:53.750553379 +0000 UTC m=+2268.616379057" observedRunningTime="2026-03-20 18:09:54.321528697 +0000 UTC m=+2269.187354405" watchObservedRunningTime="2026-03-20 18:09:54.330405098 +0000 UTC m=+2269.196230796" Mar 20 18:09:55 crc kubenswrapper[4690]: I0320 18:09:55.797056 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:55 crc kubenswrapper[4690]: I0320 18:09:55.797108 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:55 crc kubenswrapper[4690]: I0320 18:09:55.855478 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:56 crc kubenswrapper[4690]: I0320 18:09:56.364821 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:57 crc kubenswrapper[4690]: I0320 18:09:57.017807 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9sw8"] Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.344068 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m9sw8" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" containerName="registry-server" containerID="cri-o://d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0" gracePeriod=2 Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.765080 4690 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.765534 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.813857 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.824419 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.943286 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-catalog-content\") pod \"f9368f02-da6a-493a-909b-73a9d11b1370\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.943402 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfq8h\" (UniqueName: \"kubernetes.io/projected/f9368f02-da6a-493a-909b-73a9d11b1370-kube-api-access-lfq8h\") pod \"f9368f02-da6a-493a-909b-73a9d11b1370\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.943603 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-utilities\") pod \"f9368f02-da6a-493a-909b-73a9d11b1370\" (UID: \"f9368f02-da6a-493a-909b-73a9d11b1370\") " Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.944522 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-utilities" (OuterVolumeSpecName: "utilities") pod "f9368f02-da6a-493a-909b-73a9d11b1370" (UID: "f9368f02-da6a-493a-909b-73a9d11b1370"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.948476 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9368f02-da6a-493a-909b-73a9d11b1370-kube-api-access-lfq8h" (OuterVolumeSpecName: "kube-api-access-lfq8h") pod "f9368f02-da6a-493a-909b-73a9d11b1370" (UID: "f9368f02-da6a-493a-909b-73a9d11b1370"). InnerVolumeSpecName "kube-api-access-lfq8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:09:58 crc kubenswrapper[4690]: I0320 18:09:58.995847 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9368f02-da6a-493a-909b-73a9d11b1370" (UID: "f9368f02-da6a-493a-909b-73a9d11b1370"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.045524 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.045568 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9368f02-da6a-493a-909b-73a9d11b1370-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.045584 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfq8h\" (UniqueName: \"kubernetes.io/projected/f9368f02-da6a-493a-909b-73a9d11b1370-kube-api-access-lfq8h\") on node \"crc\" DevicePath \"\"" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.357494 4690 generic.go:334] "Generic (PLEG): container finished" podID="f9368f02-da6a-493a-909b-73a9d11b1370" containerID="d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0" exitCode=0 Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.357619 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9sw8" event={"ID":"f9368f02-da6a-493a-909b-73a9d11b1370","Type":"ContainerDied","Data":"d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0"} Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.357659 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9sw8" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.359511 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9sw8" event={"ID":"f9368f02-da6a-493a-909b-73a9d11b1370","Type":"ContainerDied","Data":"7bdfb1a36cff26262ecfd3492245d471d5b61136d2e29d4cfdd9d4d81c688d2e"} Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.359573 4690 scope.go:117] "RemoveContainer" containerID="d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.384699 4690 scope.go:117] "RemoveContainer" containerID="9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.430221 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9sw8"] Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.443089 4690 scope.go:117] "RemoveContainer" containerID="712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.446377 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m9sw8"] Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.447559 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.484059 4690 scope.go:117] "RemoveContainer" containerID="d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0" Mar 20 18:09:59 crc kubenswrapper[4690]: E0320 18:09:59.489386 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0\": container with ID starting with 
d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0 not found: ID does not exist" containerID="d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.489445 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0"} err="failed to get container status \"d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0\": rpc error: code = NotFound desc = could not find container \"d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0\": container with ID starting with d3b7a547bccf9dde82969aaf835c85bc34b4d72e705da559e7bab30f7d3d5cf0 not found: ID does not exist" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.489474 4690 scope.go:117] "RemoveContainer" containerID="9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5" Mar 20 18:09:59 crc kubenswrapper[4690]: E0320 18:09:59.490051 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5\": container with ID starting with 9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5 not found: ID does not exist" containerID="9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.490154 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5"} err="failed to get container status \"9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5\": rpc error: code = NotFound desc = could not find container \"9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5\": container with ID starting with 9e9326cd5abac688f0cc7f2c0ac27a3547cf4df8ed340f484369fadfe8716bb5 not found: ID does not exist" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.490238 4690 scope.go:117] "RemoveContainer" containerID="712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f" Mar 20 18:09:59 crc kubenswrapper[4690]: E0320 18:09:59.490761 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f\": container with ID starting with 712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f not found: ID does not exist" containerID="712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.490876 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f"} err="failed to get container status \"712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f\": rpc error: code = NotFound desc = could not find container \"712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f\": container with ID starting with 712b35ce201900bbdae0ff1aa1144a59b0138c581300adc8f124b97ed089bf8f not found: ID does not exist" Mar 20 18:09:59 crc kubenswrapper[4690]: I0320 18:09:59.904082 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" path="/var/lib/kubelet/pods/f9368f02-da6a-493a-909b-73a9d11b1370/volumes" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.154243 
4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567170-ss7qv"] Mar 20 18:10:00 crc kubenswrapper[4690]: E0320 18:10:00.154752 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" containerName="registry-server" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.154776 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" containerName="registry-server" Mar 20 18:10:00 crc kubenswrapper[4690]: E0320 18:10:00.154808 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" containerName="extract-content" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.154817 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" containerName="extract-content" Mar 20 18:10:00 crc kubenswrapper[4690]: E0320 18:10:00.154847 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" containerName="extract-utilities" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.154856 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" containerName="extract-utilities" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.155105 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9368f02-da6a-493a-909b-73a9d11b1370" containerName="registry-server" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.157564 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-ss7qv" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.159804 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.160099 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.160111 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.165831 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-ss7qv"] Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.268352 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmpkh\" (UniqueName: \"kubernetes.io/projected/f6f1cf15-883c-4bac-8695-749c4d80c353-kube-api-access-jmpkh\") pod \"auto-csr-approver-29567170-ss7qv\" (UID: \"f6f1cf15-883c-4bac-8695-749c4d80c353\") " pod="openshift-infra/auto-csr-approver-29567170-ss7qv" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.369578 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmpkh\" (UniqueName: \"kubernetes.io/projected/f6f1cf15-883c-4bac-8695-749c4d80c353-kube-api-access-jmpkh\") pod \"auto-csr-approver-29567170-ss7qv\" (UID: \"f6f1cf15-883c-4bac-8695-749c4d80c353\") " pod="openshift-infra/auto-csr-approver-29567170-ss7qv" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.392848 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmpkh\" (UniqueName: \"kubernetes.io/projected/f6f1cf15-883c-4bac-8695-749c4d80c353-kube-api-access-jmpkh\") pod 
\"auto-csr-approver-29567170-ss7qv\" (UID: \"f6f1cf15-883c-4bac-8695-749c4d80c353\") " pod="openshift-infra/auto-csr-approver-29567170-ss7qv" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.486764 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-ss7qv" Mar 20 18:10:00 crc kubenswrapper[4690]: I0320 18:10:00.792531 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-ss7qv"] Mar 20 18:10:01 crc kubenswrapper[4690]: I0320 18:10:01.228015 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcdjr"] Mar 20 18:10:01 crc kubenswrapper[4690]: I0320 18:10:01.394498 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-ss7qv" event={"ID":"f6f1cf15-883c-4bac-8695-749c4d80c353","Type":"ContainerStarted","Data":"527502dbd366017202940e6f51e82da49130032f44c3d0d382d021ca05093c3e"} Mar 20 18:10:01 crc kubenswrapper[4690]: I0320 18:10:01.394631 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rcdjr" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerName="registry-server" containerID="cri-o://cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9" gracePeriod=2 Mar 20 18:10:01 crc kubenswrapper[4690]: I0320 18:10:01.864037 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.019555 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljz87\" (UniqueName: \"kubernetes.io/projected/51ffd9e1-7ec0-490c-b5bf-746f61587d79-kube-api-access-ljz87\") pod \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.019614 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-catalog-content\") pod \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.019704 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-utilities\") pod \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\" (UID: \"51ffd9e1-7ec0-490c-b5bf-746f61587d79\") " Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.020867 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-utilities" (OuterVolumeSpecName: "utilities") pod "51ffd9e1-7ec0-490c-b5bf-746f61587d79" (UID: "51ffd9e1-7ec0-490c-b5bf-746f61587d79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.026789 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ffd9e1-7ec0-490c-b5bf-746f61587d79-kube-api-access-ljz87" (OuterVolumeSpecName: "kube-api-access-ljz87") pod "51ffd9e1-7ec0-490c-b5bf-746f61587d79" (UID: "51ffd9e1-7ec0-490c-b5bf-746f61587d79"). InnerVolumeSpecName "kube-api-access-ljz87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.068208 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51ffd9e1-7ec0-490c-b5bf-746f61587d79" (UID: "51ffd9e1-7ec0-490c-b5bf-746f61587d79"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.123122 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.123161 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljz87\" (UniqueName: \"kubernetes.io/projected/51ffd9e1-7ec0-490c-b5bf-746f61587d79-kube-api-access-ljz87\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.123175 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ffd9e1-7ec0-490c-b5bf-746f61587d79-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.407204 4690 generic.go:334] "Generic (PLEG): container finished" podID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerID="cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9" exitCode=0 Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.407301 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcdjr" event={"ID":"51ffd9e1-7ec0-490c-b5bf-746f61587d79","Type":"ContainerDied","Data":"cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9"} Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.407333 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcdjr" event={"ID":"51ffd9e1-7ec0-490c-b5bf-746f61587d79","Type":"ContainerDied","Data":"dfb7c260bf31487e48b1033734896c9785f51adcdb98412edb039e68db54a09e"} Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.407354 4690 scope.go:117] "RemoveContainer" containerID="cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.407486 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcdjr" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.429209 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567170-ss7qv" podStartSLOduration=1.084274852 podStartE2EDuration="2.429187657s" podCreationTimestamp="2026-03-20 18:10:00 +0000 UTC" firstStartedPulling="2026-03-20 18:10:00.791495121 +0000 UTC m=+2275.657320839" lastFinishedPulling="2026-03-20 18:10:02.136407966 +0000 UTC m=+2277.002233644" observedRunningTime="2026-03-20 18:10:02.4239792 +0000 UTC m=+2277.289804878" watchObservedRunningTime="2026-03-20 18:10:02.429187657 +0000 UTC m=+2277.295013335" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.433321 4690 scope.go:117] "RemoveContainer" containerID="4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.450827 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcdjr"] Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.460910 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rcdjr"] Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.486491 4690 scope.go:117] "RemoveContainer" containerID="a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.505555 4690 scope.go:117] "RemoveContainer" containerID="cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9" Mar 20 18:10:02 crc kubenswrapper[4690]: E0320 18:10:02.505913 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9\": container with ID starting with cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9 not found: ID does not exist" containerID="cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.505945 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9"} err="failed to get container status \"cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9\": rpc error: code = NotFound desc = could not find container \"cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9\": container with ID starting with cb2ed6969669e68d198c7a418691204aadb3354da4da3be9a30e49a233927cb9 not found: ID does not exist" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.505962 4690 scope.go:117] "RemoveContainer" containerID="4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091" Mar 20 18:10:02 crc kubenswrapper[4690]: E0320 18:10:02.507539 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091\": container with ID starting with 4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091 not found: ID does not exist" containerID="4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.507561 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091"} err="failed to get 
container status \"4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091\": rpc error: code = NotFound desc = could not find container \"4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091\": container with ID starting with 4e3423e7fe8f82671f829b1f136c74e327d7f9c114dcdc94bd0ef22cb25e8091 not found: ID does not exist" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.507574 4690 scope.go:117] "RemoveContainer" containerID="a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010" Mar 20 18:10:02 crc kubenswrapper[4690]: E0320 18:10:02.507793 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010\": container with ID starting with a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010 not found: ID does not exist" containerID="a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010" Mar 20 18:10:02 crc kubenswrapper[4690]: I0320 18:10:02.507811 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010"} err="failed to get container status \"a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010\": rpc error: code = NotFound desc = could not find container \"a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010\": container with ID starting with a20712c9c3f2706bfcabdb82426b6a12878dc444a8c5ca00556815807f054010 not found: ID does not exist" Mar 20 18:10:03 crc kubenswrapper[4690]: I0320 18:10:03.423435 4690 generic.go:334] "Generic (PLEG): container finished" podID="f6f1cf15-883c-4bac-8695-749c4d80c353" containerID="35cb50e14fcb85f8bb096379d4bacd0e363d09fa2f18a91b7452a9c13579e8fc" exitCode=0 Mar 20 18:10:03 crc kubenswrapper[4690]: I0320 18:10:03.423581 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-ss7qv" event={"ID":"f6f1cf15-883c-4bac-8695-749c4d80c353","Type":"ContainerDied","Data":"35cb50e14fcb85f8bb096379d4bacd0e363d09fa2f18a91b7452a9c13579e8fc"} Mar 20 18:10:03 crc kubenswrapper[4690]: I0320 18:10:03.898868 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" path="/var/lib/kubelet/pods/51ffd9e1-7ec0-490c-b5bf-746f61587d79/volumes" Mar 20 18:10:04 crc kubenswrapper[4690]: I0320 18:10:04.858036 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-ss7qv" Mar 20 18:10:04 crc kubenswrapper[4690]: I0320 18:10:04.981679 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmpkh\" (UniqueName: \"kubernetes.io/projected/f6f1cf15-883c-4bac-8695-749c4d80c353-kube-api-access-jmpkh\") pod \"f6f1cf15-883c-4bac-8695-749c4d80c353\" (UID: \"f6f1cf15-883c-4bac-8695-749c4d80c353\") " Mar 20 18:10:04 crc kubenswrapper[4690]: I0320 18:10:04.987152 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f1cf15-883c-4bac-8695-749c4d80c353-kube-api-access-jmpkh" (OuterVolumeSpecName: "kube-api-access-jmpkh") pod "f6f1cf15-883c-4bac-8695-749c4d80c353" (UID: "f6f1cf15-883c-4bac-8695-749c4d80c353"). InnerVolumeSpecName "kube-api-access-jmpkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:10:05 crc kubenswrapper[4690]: I0320 18:10:05.084690 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmpkh\" (UniqueName: \"kubernetes.io/projected/f6f1cf15-883c-4bac-8695-749c4d80c353-kube-api-access-jmpkh\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:05 crc kubenswrapper[4690]: I0320 18:10:05.454228 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-ss7qv" event={"ID":"f6f1cf15-883c-4bac-8695-749c4d80c353","Type":"ContainerDied","Data":"527502dbd366017202940e6f51e82da49130032f44c3d0d382d021ca05093c3e"} Mar 20 18:10:05 crc kubenswrapper[4690]: I0320 18:10:05.454290 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="527502dbd366017202940e6f51e82da49130032f44c3d0d382d021ca05093c3e" Mar 20 18:10:05 crc kubenswrapper[4690]: I0320 18:10:05.454391 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-ss7qv" Mar 20 18:10:05 crc kubenswrapper[4690]: I0320 18:10:05.534166 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-cd9d7"] Mar 20 18:10:05 crc kubenswrapper[4690]: I0320 18:10:05.541365 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-cd9d7"] Mar 20 18:10:05 crc kubenswrapper[4690]: I0320 18:10:05.896582 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6adac120-f240-4287-881f-428d4400c7c2" path="/var/lib/kubelet/pods/6adac120-f240-4287-881f-428d4400c7c2/volumes" Mar 20 18:10:23 crc kubenswrapper[4690]: I0320 18:10:23.104071 4690 scope.go:117] "RemoveContainer" containerID="18952c04d289dd13580ce4f54fd816b23f488300e4119b032de7140bafb75567" Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.274155 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.274629 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.274710 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.275931 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.276051 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" 
containerID="cri-o://24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" gracePeriod=600 Mar 20 18:10:24 crc kubenswrapper[4690]: E0320 18:10:24.408407 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.698518 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" exitCode=0 Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.698610 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12"} Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.698687 4690 scope.go:117] "RemoveContainer" containerID="9c743870b72976847070b0c9956af89e5f5f2891d80131c888a10eec990b9c51" Mar 20 18:10:24 crc kubenswrapper[4690]: I0320 18:10:24.699909 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:10:24 crc kubenswrapper[4690]: E0320 18:10:24.700459 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:10:37 crc kubenswrapper[4690]: I0320 18:10:37.884100 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:10:37 crc kubenswrapper[4690]: E0320 18:10:37.885383 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:10:49 crc kubenswrapper[4690]: I0320 18:10:49.884389 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:10:49 crc kubenswrapper[4690]: E0320 18:10:49.885589 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:11:03 crc kubenswrapper[4690]: I0320 18:11:03.883817 4690 scope.go:117] "RemoveContainer" 
containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:11:03 crc kubenswrapper[4690]: E0320 18:11:03.884823 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.885584 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:11:17 crc kubenswrapper[4690]: E0320 18:11:17.886737 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.904418 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vk9sz"] Mar 20 18:11:17 crc kubenswrapper[4690]: E0320 18:11:17.905102 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerName="extract-content" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.905143 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerName="extract-content" Mar 20 18:11:17 crc kubenswrapper[4690]: E0320 18:11:17.905181 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerName="extract-utilities" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.905195 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerName="extract-utilities" Mar 20 18:11:17 crc kubenswrapper[4690]: E0320 18:11:17.905221 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f1cf15-883c-4bac-8695-749c4d80c353" containerName="oc" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.905236 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f1cf15-883c-4bac-8695-749c4d80c353" containerName="oc" Mar 20 18:11:17 crc kubenswrapper[4690]: E0320 18:11:17.905295 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerName="registry-server" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.905314 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerName="registry-server" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.905839 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ffd9e1-7ec0-490c-b5bf-746f61587d79" containerName="registry-server" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.905919 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f1cf15-883c-4bac-8695-749c4d80c353" containerName="oc" Mar 20 18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.908991 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk9sz"] Mar 20 
18:11:17 crc kubenswrapper[4690]: I0320 18:11:17.909203 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.031355 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzfgm\" (UniqueName: \"kubernetes.io/projected/24626328-6a8d-41ac-9663-3c27c9f3ad4e-kube-api-access-tzfgm\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.031461 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-catalog-content\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.031606 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-utilities\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.132768 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-utilities\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.132883 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzfgm\" (UniqueName: \"kubernetes.io/projected/24626328-6a8d-41ac-9663-3c27c9f3ad4e-kube-api-access-tzfgm\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.132910 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-catalog-content\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.133432 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-catalog-content\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.133492 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-utilities\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.155963 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzfgm\" (UniqueName: 
\"kubernetes.io/projected/24626328-6a8d-41ac-9663-3c27c9f3ad4e-kube-api-access-tzfgm\") pod \"redhat-marketplace-vk9sz\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.245411 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:18 crc kubenswrapper[4690]: I0320 18:11:18.751101 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk9sz"] Mar 20 18:11:19 crc kubenswrapper[4690]: I0320 18:11:19.327186 4690 generic.go:334] "Generic (PLEG): container finished" podID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerID="d7632ecec498715b45117fd09f264bb0fed0fe335f3301992484c0df179cb01b" exitCode=0 Mar 20 18:11:19 crc kubenswrapper[4690]: I0320 18:11:19.327300 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk9sz" event={"ID":"24626328-6a8d-41ac-9663-3c27c9f3ad4e","Type":"ContainerDied","Data":"d7632ecec498715b45117fd09f264bb0fed0fe335f3301992484c0df179cb01b"} Mar 20 18:11:19 crc kubenswrapper[4690]: I0320 18:11:19.327343 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk9sz" event={"ID":"24626328-6a8d-41ac-9663-3c27c9f3ad4e","Type":"ContainerStarted","Data":"1255d527906a84fa3819d9aac65f364e6922cc2bdedbd9e1c954ffb71519646f"} Mar 20 18:11:20 crc kubenswrapper[4690]: I0320 18:11:20.336543 4690 generic.go:334] "Generic (PLEG): container finished" podID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerID="d1c99c91e11d00e6ed285ef9a41e1f5ee2e3ffdc826f84aa1511d05249b3517c" exitCode=0 Mar 20 18:11:20 crc kubenswrapper[4690]: I0320 18:11:20.336796 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk9sz" event={"ID":"24626328-6a8d-41ac-9663-3c27c9f3ad4e","Type":"ContainerDied","Data":"d1c99c91e11d00e6ed285ef9a41e1f5ee2e3ffdc826f84aa1511d05249b3517c"} Mar 20 18:11:21 crc kubenswrapper[4690]: I0320 18:11:21.352248 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk9sz" event={"ID":"24626328-6a8d-41ac-9663-3c27c9f3ad4e","Type":"ContainerStarted","Data":"0d65534597a77f19db03240ba65a77279a5cdefe27a79197170d48fbb7c5d300"} Mar 20 18:11:21 crc kubenswrapper[4690]: I0320 18:11:21.396878 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vk9sz" podStartSLOduration=2.979874528 podStartE2EDuration="4.396856614s" podCreationTimestamp="2026-03-20 18:11:17 +0000 UTC" firstStartedPulling="2026-03-20 18:11:19.329859932 +0000 UTC m=+2354.195685650" lastFinishedPulling="2026-03-20 18:11:20.746842048 +0000 UTC m=+2355.612667736" observedRunningTime="2026-03-20 18:11:21.384843053 +0000 UTC m=+2356.250668741" watchObservedRunningTime="2026-03-20 18:11:21.396856614 +0000 UTC m=+2356.262682302" Mar 20 18:11:28 crc kubenswrapper[4690]: I0320 18:11:28.245682 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:28 crc kubenswrapper[4690]: I0320 18:11:28.246056 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:28 crc kubenswrapper[4690]: I0320 18:11:28.313451 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:28 crc kubenswrapper[4690]: I0320 18:11:28.479040 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:28 crc kubenswrapper[4690]: I0320 18:11:28.564827 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk9sz"] Mar 20 18:11:29 crc kubenswrapper[4690]: I0320 18:11:29.883348 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:11:29 crc kubenswrapper[4690]: E0320 18:11:29.885323 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:11:30 crc kubenswrapper[4690]: I0320 18:11:30.460106 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vk9sz" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerName="registry-server" containerID="cri-o://0d65534597a77f19db03240ba65a77279a5cdefe27a79197170d48fbb7c5d300" gracePeriod=2 Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.475119 4690 generic.go:334] "Generic (PLEG): container finished" podID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerID="0d65534597a77f19db03240ba65a77279a5cdefe27a79197170d48fbb7c5d300" exitCode=0 Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.475364 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk9sz" event={"ID":"24626328-6a8d-41ac-9663-3c27c9f3ad4e","Type":"ContainerDied","Data":"0d65534597a77f19db03240ba65a77279a5cdefe27a79197170d48fbb7c5d300"} Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.475481 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk9sz" event={"ID":"24626328-6a8d-41ac-9663-3c27c9f3ad4e","Type":"ContainerDied","Data":"1255d527906a84fa3819d9aac65f364e6922cc2bdedbd9e1c954ffb71519646f"} Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.475496 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1255d527906a84fa3819d9aac65f364e6922cc2bdedbd9e1c954ffb71519646f" Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.550367 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.735945 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-catalog-content\") pod \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.736082 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzfgm\" (UniqueName: \"kubernetes.io/projected/24626328-6a8d-41ac-9663-3c27c9f3ad4e-kube-api-access-tzfgm\") pod \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.736532 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-utilities\") pod \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\" (UID: \"24626328-6a8d-41ac-9663-3c27c9f3ad4e\") " Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.738294 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-utilities" (OuterVolumeSpecName: "utilities") pod "24626328-6a8d-41ac-9663-3c27c9f3ad4e" (UID: "24626328-6a8d-41ac-9663-3c27c9f3ad4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.748667 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24626328-6a8d-41ac-9663-3c27c9f3ad4e-kube-api-access-tzfgm" (OuterVolumeSpecName: "kube-api-access-tzfgm") pod "24626328-6a8d-41ac-9663-3c27c9f3ad4e" (UID: "24626328-6a8d-41ac-9663-3c27c9f3ad4e"). InnerVolumeSpecName "kube-api-access-tzfgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.784185 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24626328-6a8d-41ac-9663-3c27c9f3ad4e" (UID: "24626328-6a8d-41ac-9663-3c27c9f3ad4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.839573 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.839613 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24626328-6a8d-41ac-9663-3c27c9f3ad4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:31 crc kubenswrapper[4690]: I0320 18:11:31.839632 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzfgm\" (UniqueName: \"kubernetes.io/projected/24626328-6a8d-41ac-9663-3c27c9f3ad4e-kube-api-access-tzfgm\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:32 crc kubenswrapper[4690]: I0320 18:11:32.495809 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vk9sz" Mar 20 18:11:32 crc kubenswrapper[4690]: I0320 18:11:32.527566 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk9sz"] Mar 20 18:11:32 crc kubenswrapper[4690]: I0320 18:11:32.543927 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk9sz"] Mar 20 18:11:33 crc kubenswrapper[4690]: I0320 18:11:33.903970 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" path="/var/lib/kubelet/pods/24626328-6a8d-41ac-9663-3c27c9f3ad4e/volumes" Mar 20 18:11:44 crc kubenswrapper[4690]: I0320 18:11:44.884373 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:11:44 crc kubenswrapper[4690]: E0320 18:11:44.885572 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:11:55 crc kubenswrapper[4690]: I0320 18:11:55.892423 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:11:55 crc kubenswrapper[4690]: E0320 18:11:55.893144 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.165300 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567172-h7kxj"] Mar 20 18:12:00 crc kubenswrapper[4690]: E0320 18:12:00.166058 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerName="extract-content" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.166076 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerName="extract-content" Mar 20 18:12:00 crc kubenswrapper[4690]: E0320 18:12:00.166117 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerName="extract-utilities" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.166126 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerName="extract-utilities" Mar 20 18:12:00 crc kubenswrapper[4690]: E0320 18:12:00.166142 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.166149 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.166437 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="24626328-6a8d-41ac-9663-3c27c9f3ad4e" 
containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.167144 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-h7kxj" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.169961 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.170232 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.170680 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.182458 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-h7kxj"] Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.239783 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lmxd\" (UniqueName: \"kubernetes.io/projected/e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d-kube-api-access-9lmxd\") pod \"auto-csr-approver-29567172-h7kxj\" (UID: \"e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d\") " pod="openshift-infra/auto-csr-approver-29567172-h7kxj" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.341648 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lmxd\" (UniqueName: \"kubernetes.io/projected/e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d-kube-api-access-9lmxd\") pod \"auto-csr-approver-29567172-h7kxj\" (UID: \"e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d\") " pod="openshift-infra/auto-csr-approver-29567172-h7kxj" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.365137 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lmxd\" (UniqueName: \"kubernetes.io/projected/e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d-kube-api-access-9lmxd\") pod \"auto-csr-approver-29567172-h7kxj\" (UID: \"e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d\") " pod="openshift-infra/auto-csr-approver-29567172-h7kxj" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.494737 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-h7kxj" Mar 20 18:12:00 crc kubenswrapper[4690]: I0320 18:12:00.967028 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-h7kxj"] Mar 20 18:12:00 crc kubenswrapper[4690]: W0320 18:12:00.977441 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2ae2e33_2ac4_4ff3_a4c1_48c733a21a0d.slice/crio-945939917dd7709a714e1dd398a79a21a676997d3b3731e2726c9972fd0577ac WatchSource:0}: Error finding container 945939917dd7709a714e1dd398a79a21a676997d3b3731e2726c9972fd0577ac: Status 404 returned error can't find the container with id 945939917dd7709a714e1dd398a79a21a676997d3b3731e2726c9972fd0577ac Mar 20 18:12:01 crc kubenswrapper[4690]: I0320 18:12:01.789789 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-h7kxj" event={"ID":"e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d","Type":"ContainerStarted","Data":"945939917dd7709a714e1dd398a79a21a676997d3b3731e2726c9972fd0577ac"} Mar 20 18:12:02 crc kubenswrapper[4690]: I0320 18:12:02.802449 4690 generic.go:334] "Generic (PLEG): container finished" podID="e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d" containerID="386988d99d4cb11dafa7d024a5bd3cc5301f9c479049f50cbf1bbd3273d719a0" exitCode=0 Mar 20 18:12:02 crc kubenswrapper[4690]: I0320 18:12:02.802604 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-h7kxj" event={"ID":"e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d","Type":"ContainerDied","Data":"386988d99d4cb11dafa7d024a5bd3cc5301f9c479049f50cbf1bbd3273d719a0"} Mar 20 18:12:04 crc kubenswrapper[4690]: I0320 18:12:04.135349 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-h7kxj" Mar 20 18:12:04 crc kubenswrapper[4690]: I0320 18:12:04.214642 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lmxd\" (UniqueName: \"kubernetes.io/projected/e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d-kube-api-access-9lmxd\") pod \"e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d\" (UID: \"e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d\") " Mar 20 18:12:04 crc kubenswrapper[4690]: I0320 18:12:04.221678 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d-kube-api-access-9lmxd" (OuterVolumeSpecName: "kube-api-access-9lmxd") pod "e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d" (UID: "e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d"). InnerVolumeSpecName "kube-api-access-9lmxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:12:04 crc kubenswrapper[4690]: I0320 18:12:04.317357 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lmxd\" (UniqueName: \"kubernetes.io/projected/e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d-kube-api-access-9lmxd\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:04 crc kubenswrapper[4690]: I0320 18:12:04.824590 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-h7kxj" event={"ID":"e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d","Type":"ContainerDied","Data":"945939917dd7709a714e1dd398a79a21a676997d3b3731e2726c9972fd0577ac"} Mar 20 18:12:04 crc kubenswrapper[4690]: I0320 18:12:04.824645 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="945939917dd7709a714e1dd398a79a21a676997d3b3731e2726c9972fd0577ac" Mar 20 18:12:04 crc kubenswrapper[4690]: I0320 18:12:04.824656 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-h7kxj" Mar 20 18:12:05 crc kubenswrapper[4690]: I0320 18:12:05.218394 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-xg9rc"] Mar 20 18:12:05 crc kubenswrapper[4690]: I0320 18:12:05.240046 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-xg9rc"] Mar 20 18:12:05 crc kubenswrapper[4690]: I0320 18:12:05.896585 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fcb3df-a81e-40b0-aefe-2f740db37426" path="/var/lib/kubelet/pods/95fcb3df-a81e-40b0-aefe-2f740db37426/volumes" Mar 20 18:12:10 crc kubenswrapper[4690]: I0320 18:12:10.883104 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:12:10 crc kubenswrapper[4690]: E0320 18:12:10.883702 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:12:21 crc kubenswrapper[4690]: I0320 18:12:21.883205 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:12:21 crc kubenswrapper[4690]: E0320 18:12:21.884066 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:12:23 crc kubenswrapper[4690]: I0320 18:12:23.266345 4690 scope.go:117] "RemoveContainer" containerID="21cc76818b2012ca77adf91823bae802e3edaeebbf49fee7438e2987ab894ec3" Mar 20 18:12:33 crc kubenswrapper[4690]: I0320 18:12:33.884387 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:12:33 crc kubenswrapper[4690]: E0320 18:12:33.885235 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:12:45 crc kubenswrapper[4690]: I0320 18:12:45.891406 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:12:45 crc kubenswrapper[4690]: E0320 18:12:45.892284 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:12:58 crc kubenswrapper[4690]: I0320 18:12:58.883550 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:12:58 crc kubenswrapper[4690]: E0320 18:12:58.884815 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:13:13 crc kubenswrapper[4690]: I0320 18:13:13.883438 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:13:13 crc kubenswrapper[4690]: E0320 18:13:13.884214 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:13:14 crc kubenswrapper[4690]: I0320 18:13:14.563559 4690 generic.go:334] "Generic (PLEG): container finished" podID="ca6878cf-74a4-4bf6-8e36-bf1a669d787f" containerID="b8645cdea1b9c53e76c0669a5c440ce546d7538d9a0c01075ac051411f16afc5" exitCode=0 Mar 20 18:13:14 crc kubenswrapper[4690]: I0320 18:13:14.563615 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" event={"ID":"ca6878cf-74a4-4bf6-8e36-bf1a669d787f","Type":"ContainerDied","Data":"b8645cdea1b9c53e76c0669a5c440ce546d7538d9a0c01075ac051411f16afc5"} Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.075638 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.242444 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-ssh-key-openstack-edpm-ipam\") pod \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.242846 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-inventory\") pod \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.243156 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm9jv\" (UniqueName: \"kubernetes.io/projected/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-kube-api-access-vm9jv\") pod \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.243462 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-combined-ca-bundle\") pod \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.243843 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-secret-0\") pod \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\" (UID: \"ca6878cf-74a4-4bf6-8e36-bf1a669d787f\") " Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.249700 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-kube-api-access-vm9jv" (OuterVolumeSpecName: "kube-api-access-vm9jv") pod "ca6878cf-74a4-4bf6-8e36-bf1a669d787f" (UID: "ca6878cf-74a4-4bf6-8e36-bf1a669d787f"). InnerVolumeSpecName "kube-api-access-vm9jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.249746 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ca6878cf-74a4-4bf6-8e36-bf1a669d787f" (UID: "ca6878cf-74a4-4bf6-8e36-bf1a669d787f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.277561 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-inventory" (OuterVolumeSpecName: "inventory") pod "ca6878cf-74a4-4bf6-8e36-bf1a669d787f" (UID: "ca6878cf-74a4-4bf6-8e36-bf1a669d787f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.278168 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca6878cf-74a4-4bf6-8e36-bf1a669d787f" (UID: "ca6878cf-74a4-4bf6-8e36-bf1a669d787f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.288547 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "ca6878cf-74a4-4bf6-8e36-bf1a669d787f" (UID: "ca6878cf-74a4-4bf6-8e36-bf1a669d787f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.346911 4690 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.346945 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.346957 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.346970 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm9jv\" (UniqueName: \"kubernetes.io/projected/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-kube-api-access-vm9jv\") on node \"crc\" DevicePath \"\"" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.346978 4690 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6878cf-74a4-4bf6-8e36-bf1a669d787f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.587034 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" event={"ID":"ca6878cf-74a4-4bf6-8e36-bf1a669d787f","Type":"ContainerDied","Data":"160a14d0c29241e32561934a9335f784ee0f4aa6094cf272b35d7add4a6d5a10"} Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.587075 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="160a14d0c29241e32561934a9335f784ee0f4aa6094cf272b35d7add4a6d5a10" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.587086 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-926mx" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.701680 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5"] Mar 20 18:13:16 crc kubenswrapper[4690]: E0320 18:13:16.702449 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d" containerName="oc" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.702471 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d" containerName="oc" Mar 20 18:13:16 crc kubenswrapper[4690]: E0320 18:13:16.702486 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6878cf-74a4-4bf6-8e36-bf1a669d787f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.702496 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6878cf-74a4-4bf6-8e36-bf1a669d787f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.702717 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d" containerName="oc" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.702740 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6878cf-74a4-4bf6-8e36-bf1a669d787f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.703465 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.707897 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.708119 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.708340 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.708487 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.708629 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.708832 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.714427 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.741766 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5"] Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.856225 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.856357 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.857455 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.857544 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7xx\" (UniqueName: \"kubernetes.io/projected/8146ff99-3308-4b91-b487-3bd707bed4dd-kube-api-access-ck7xx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.857660 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.857735 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.857784 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.857867 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.857934 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.858028 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.858097 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.960422 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.960737 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.960856 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.960975 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.961095 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.961291 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.961641 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.961840 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.961972 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.962266 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.962393 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7xx\" (UniqueName: \"kubernetes.io/projected/8146ff99-3308-4b91-b487-3bd707bed4dd-kube-api-access-ck7xx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.962704 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.966678 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.966696 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.966683 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.967964 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.968567 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.969640 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.969773 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.971027 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.972470 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:16 crc kubenswrapper[4690]: I0320 18:13:16.979732 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck7xx\" (UniqueName: \"kubernetes.io/projected/8146ff99-3308-4b91-b487-3bd707bed4dd-kube-api-access-ck7xx\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-bs8n5\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:17 crc kubenswrapper[4690]: I0320 18:13:17.037736 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:13:18 crc kubenswrapper[4690]: I0320 18:13:17.588063 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5"] Mar 20 18:13:18 crc kubenswrapper[4690]: W0320 18:13:17.593487 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8146ff99_3308_4b91_b487_3bd707bed4dd.slice/crio-3e4a0de290c84f90311be82b7dcc45c26cc4700fb8b51126e068101b680f009e WatchSource:0}: Error finding container 3e4a0de290c84f90311be82b7dcc45c26cc4700fb8b51126e068101b680f009e: Status 404 returned error can't find the container with id 3e4a0de290c84f90311be82b7dcc45c26cc4700fb8b51126e068101b680f009e Mar 20 18:13:18 crc kubenswrapper[4690]: I0320 18:13:17.595564 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:13:18 crc kubenswrapper[4690]: I0320 18:13:18.605200 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" event={"ID":"8146ff99-3308-4b91-b487-3bd707bed4dd","Type":"ContainerStarted","Data":"7d2acaf39330b4a3decdbe295748d193ae75356e305474b95d81341fbd3c2886"} Mar 20 18:13:18 crc kubenswrapper[4690]: I0320 18:13:18.605977 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" event={"ID":"8146ff99-3308-4b91-b487-3bd707bed4dd","Type":"ContainerStarted","Data":"3e4a0de290c84f90311be82b7dcc45c26cc4700fb8b51126e068101b680f009e"} Mar 20 18:13:18 crc kubenswrapper[4690]: I0320 18:13:18.634911 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" podStartSLOduration=2.181333862 podStartE2EDuration="2.634886772s" podCreationTimestamp="2026-03-20 18:13:16 +0000 UTC" firstStartedPulling="2026-03-20 18:13:17.595376581 +0000 UTC m=+2472.461202259" lastFinishedPulling="2026-03-20 18:13:18.048929491 +0000 UTC m=+2472.914755169" observedRunningTime="2026-03-20 18:13:18.626876525 +0000 UTC m=+2473.492702203" watchObservedRunningTime="2026-03-20 18:13:18.634886772 +0000 UTC m=+2473.500712460" Mar 20 18:13:28 crc kubenswrapper[4690]: I0320 18:13:28.883377 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:13:28 crc kubenswrapper[4690]: E0320 18:13:28.884284 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:13:42 crc kubenswrapper[4690]: I0320 18:13:42.883782 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:13:42 crc kubenswrapper[4690]: E0320 18:13:42.885142 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:13:56 crc kubenswrapper[4690]: I0320 18:13:56.883492 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:13:56 crc kubenswrapper[4690]: E0320 18:13:56.884305 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.158732 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567174-77prw"] Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.160590 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-77prw" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.162790 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.163316 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.163594 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.173467 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-77prw"] Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.218230 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2fp\" (UniqueName: \"kubernetes.io/projected/38bf052e-356e-49e0-af8e-e3c78c8da186-kube-api-access-xg2fp\") pod \"auto-csr-approver-29567174-77prw\" (UID: \"38bf052e-356e-49e0-af8e-e3c78c8da186\") " pod="openshift-infra/auto-csr-approver-29567174-77prw" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.320346 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2fp\" (UniqueName: \"kubernetes.io/projected/38bf052e-356e-49e0-af8e-e3c78c8da186-kube-api-access-xg2fp\") pod \"auto-csr-approver-29567174-77prw\" (UID: \"38bf052e-356e-49e0-af8e-e3c78c8da186\") " pod="openshift-infra/auto-csr-approver-29567174-77prw" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.340681 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2fp\" (UniqueName: \"kubernetes.io/projected/38bf052e-356e-49e0-af8e-e3c78c8da186-kube-api-access-xg2fp\") pod \"auto-csr-approver-29567174-77prw\" (UID: \"38bf052e-356e-49e0-af8e-e3c78c8da186\") " pod="openshift-infra/auto-csr-approver-29567174-77prw" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.518238 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-77prw" Mar 20 18:14:00 crc kubenswrapper[4690]: I0320 18:14:00.959759 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-77prw"] Mar 20 18:14:01 crc kubenswrapper[4690]: I0320 18:14:01.072453 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-77prw" event={"ID":"38bf052e-356e-49e0-af8e-e3c78c8da186","Type":"ContainerStarted","Data":"13b563184b8a224664635098b83b213f9c2749f4e163e67bbc9436d3f3377268"} Mar 20 18:14:03 crc kubenswrapper[4690]: I0320 18:14:03.102387 4690 generic.go:334] "Generic (PLEG): container finished" podID="38bf052e-356e-49e0-af8e-e3c78c8da186" containerID="205143e133a6d4baff66c65824716168bade68aa5a43caf672791c4c360868ab" exitCode=0 Mar 20 18:14:03 crc kubenswrapper[4690]: I0320 18:14:03.102467 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-77prw" event={"ID":"38bf052e-356e-49e0-af8e-e3c78c8da186","Type":"ContainerDied","Data":"205143e133a6d4baff66c65824716168bade68aa5a43caf672791c4c360868ab"} Mar 20 18:14:04 crc kubenswrapper[4690]: I0320 18:14:04.464986 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-77prw" Mar 20 18:14:04 crc kubenswrapper[4690]: I0320 18:14:04.510980 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg2fp\" (UniqueName: \"kubernetes.io/projected/38bf052e-356e-49e0-af8e-e3c78c8da186-kube-api-access-xg2fp\") pod \"38bf052e-356e-49e0-af8e-e3c78c8da186\" (UID: \"38bf052e-356e-49e0-af8e-e3c78c8da186\") " Mar 20 18:14:04 crc kubenswrapper[4690]: I0320 18:14:04.521363 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bf052e-356e-49e0-af8e-e3c78c8da186-kube-api-access-xg2fp" (OuterVolumeSpecName: "kube-api-access-xg2fp") pod "38bf052e-356e-49e0-af8e-e3c78c8da186" (UID: "38bf052e-356e-49e0-af8e-e3c78c8da186"). InnerVolumeSpecName "kube-api-access-xg2fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:14:04 crc kubenswrapper[4690]: I0320 18:14:04.614473 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg2fp\" (UniqueName: \"kubernetes.io/projected/38bf052e-356e-49e0-af8e-e3c78c8da186-kube-api-access-xg2fp\") on node \"crc\" DevicePath \"\"" Mar 20 18:14:05 crc kubenswrapper[4690]: I0320 18:14:05.125637 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-77prw" event={"ID":"38bf052e-356e-49e0-af8e-e3c78c8da186","Type":"ContainerDied","Data":"13b563184b8a224664635098b83b213f9c2749f4e163e67bbc9436d3f3377268"} Mar 20 18:14:05 crc kubenswrapper[4690]: I0320 18:14:05.126032 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13b563184b8a224664635098b83b213f9c2749f4e163e67bbc9436d3f3377268" Mar 20 18:14:05 crc kubenswrapper[4690]: I0320 18:14:05.125707 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-77prw" Mar 20 18:14:05 crc kubenswrapper[4690]: I0320 18:14:05.543280 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-dhksw"] Mar 20 18:14:05 crc kubenswrapper[4690]: I0320 18:14:05.550648 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-dhksw"] Mar 20 18:14:05 crc kubenswrapper[4690]: I0320 18:14:05.897771 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b7b7a1-8685-423b-a27c-fd5c5785c056" path="/var/lib/kubelet/pods/42b7b7a1-8685-423b-a27c-fd5c5785c056/volumes" Mar 20 18:14:07 crc kubenswrapper[4690]: I0320 18:14:07.883993 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:14:07 crc kubenswrapper[4690]: E0320 18:14:07.884568 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:14:21 crc kubenswrapper[4690]: I0320 18:14:21.883973 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:14:21 crc kubenswrapper[4690]: E0320 18:14:21.885415 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:14:23 crc kubenswrapper[4690]: I0320 18:14:23.407101 4690 scope.go:117] "RemoveContainer" containerID="849623b4d82636d8b77839631781372232dfffbfa828a867ce29cea9caefa3ee" Mar 20 18:14:34 crc kubenswrapper[4690]: I0320 18:14:34.884783 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:14:34 crc kubenswrapper[4690]: E0320 18:14:34.886470 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:14:49 crc kubenswrapper[4690]: I0320 18:14:49.882922 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:14:49 crc kubenswrapper[4690]: E0320 18:14:49.883609 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 
18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.150978 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6"] Mar 20 18:15:00 crc kubenswrapper[4690]: E0320 18:15:00.152021 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bf052e-356e-49e0-af8e-e3c78c8da186" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.152122 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bf052e-356e-49e0-af8e-e3c78c8da186" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.152362 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bf052e-356e-49e0-af8e-e3c78c8da186" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.153094 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.155710 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.156112 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.158936 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6"] Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.185392 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0b5f94d-7149-4343-8bda-d76f89818a1c-config-volume\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.185487 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqngx\" (UniqueName: \"kubernetes.io/projected/f0b5f94d-7149-4343-8bda-d76f89818a1c-kube-api-access-dqngx\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.185521 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0b5f94d-7149-4343-8bda-d76f89818a1c-secret-volume\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.287084 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0b5f94d-7149-4343-8bda-d76f89818a1c-config-volume\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.287171 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqngx\" (UniqueName: 
\"kubernetes.io/projected/f0b5f94d-7149-4343-8bda-d76f89818a1c-kube-api-access-dqngx\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.287209 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0b5f94d-7149-4343-8bda-d76f89818a1c-secret-volume\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.288370 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0b5f94d-7149-4343-8bda-d76f89818a1c-config-volume\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.302914 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0b5f94d-7149-4343-8bda-d76f89818a1c-secret-volume\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.306246 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqngx\" (UniqueName: \"kubernetes.io/projected/f0b5f94d-7149-4343-8bda-d76f89818a1c-kube-api-access-dqngx\") pod \"collect-profiles-29567175-4w5q6\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:00 crc kubenswrapper[4690]: I0320 18:15:00.484698 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:01 crc kubenswrapper[4690]: I0320 18:15:01.004701 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6"] Mar 20 18:15:01 crc kubenswrapper[4690]: I0320 18:15:01.729190 4690 generic.go:334] "Generic (PLEG): container finished" podID="f0b5f94d-7149-4343-8bda-d76f89818a1c" containerID="8cc91a10d471ec928af8c65fbe51a9dc977e3e7e7b89655160517e67e4144d71" exitCode=0 Mar 20 18:15:01 crc kubenswrapper[4690]: I0320 18:15:01.729319 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" event={"ID":"f0b5f94d-7149-4343-8bda-d76f89818a1c","Type":"ContainerDied","Data":"8cc91a10d471ec928af8c65fbe51a9dc977e3e7e7b89655160517e67e4144d71"} Mar 20 18:15:01 crc kubenswrapper[4690]: I0320 18:15:01.730673 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" event={"ID":"f0b5f94d-7149-4343-8bda-d76f89818a1c","Type":"ContainerStarted","Data":"244f52773e985ab12de2c96f0deb90e730c38c5e2551dd3f48a85a92edfeed17"} Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.149573 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.347046 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0b5f94d-7149-4343-8bda-d76f89818a1c-config-volume\") pod \"f0b5f94d-7149-4343-8bda-d76f89818a1c\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.347118 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqngx\" (UniqueName: \"kubernetes.io/projected/f0b5f94d-7149-4343-8bda-d76f89818a1c-kube-api-access-dqngx\") pod \"f0b5f94d-7149-4343-8bda-d76f89818a1c\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.347248 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0b5f94d-7149-4343-8bda-d76f89818a1c-secret-volume\") pod \"f0b5f94d-7149-4343-8bda-d76f89818a1c\" (UID: \"f0b5f94d-7149-4343-8bda-d76f89818a1c\") " Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.347917 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b5f94d-7149-4343-8bda-d76f89818a1c-config-volume" (OuterVolumeSpecName: "config-volume") pod "f0b5f94d-7149-4343-8bda-d76f89818a1c" (UID: "f0b5f94d-7149-4343-8bda-d76f89818a1c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.352465 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b5f94d-7149-4343-8bda-d76f89818a1c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f0b5f94d-7149-4343-8bda-d76f89818a1c" (UID: "f0b5f94d-7149-4343-8bda-d76f89818a1c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.352694 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b5f94d-7149-4343-8bda-d76f89818a1c-kube-api-access-dqngx" (OuterVolumeSpecName: "kube-api-access-dqngx") pod "f0b5f94d-7149-4343-8bda-d76f89818a1c" (UID: "f0b5f94d-7149-4343-8bda-d76f89818a1c"). InnerVolumeSpecName "kube-api-access-dqngx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.450045 4690 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0b5f94d-7149-4343-8bda-d76f89818a1c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.450087 4690 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0b5f94d-7149-4343-8bda-d76f89818a1c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.450101 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqngx\" (UniqueName: \"kubernetes.io/projected/f0b5f94d-7149-4343-8bda-d76f89818a1c-kube-api-access-dqngx\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.752795 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" event={"ID":"f0b5f94d-7149-4343-8bda-d76f89818a1c","Type":"ContainerDied","Data":"244f52773e985ab12de2c96f0deb90e730c38c5e2551dd3f48a85a92edfeed17"} Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.752843 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244f52773e985ab12de2c96f0deb90e730c38c5e2551dd3f48a85a92edfeed17" Mar 20 18:15:03 crc kubenswrapper[4690]: I0320 18:15:03.752882 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-4w5q6" Mar 20 18:15:04 crc kubenswrapper[4690]: I0320 18:15:04.242436 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x"] Mar 20 18:15:04 crc kubenswrapper[4690]: I0320 18:15:04.260373 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-scc4x"] Mar 20 18:15:04 crc kubenswrapper[4690]: I0320 18:15:04.883377 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:15:04 crc kubenswrapper[4690]: E0320 18:15:04.883586 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:15:05 crc kubenswrapper[4690]: I0320 18:15:05.906132 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f86e30-e6e2-473e-8a52-c1e86d28c2e2" path="/var/lib/kubelet/pods/03f86e30-e6e2-473e-8a52-c1e86d28c2e2/volumes" Mar 20 18:15:17 crc kubenswrapper[4690]: I0320 18:15:17.883869 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:15:17 crc kubenswrapper[4690]: E0320 18:15:17.884766 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:15:23 crc kubenswrapper[4690]: I0320 18:15:23.504009 4690 scope.go:117] "RemoveContainer" containerID="703fc82a0d26e5620a573c19a2ae4b9e7776c9253848f6560f212c1a48b1f19a" Mar 20 18:15:30 crc kubenswrapper[4690]: I0320 18:15:30.883950 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:15:32 crc kubenswrapper[4690]: I0320 18:15:32.091623 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"3f75d6ec75b86f2dc0e83c5cf97b53edbe2da563ef7799aed9f421624b964264"} Mar 20 18:15:34 crc kubenswrapper[4690]: I0320 18:15:34.611819 4690 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-xcb58" podUID="595a25b2-d477-4ec7-b9ad-8eb670c2ea3f" containerName="registry-server" probeResult="failure" output=< Mar 20 18:15:34 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 18:15:34 crc kubenswrapper[4690]: > Mar 20 18:15:34 crc kubenswrapper[4690]: I0320 18:15:34.617793 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-xcb58" podUID="595a25b2-d477-4ec7-b9ad-8eb670c2ea3f" containerName="registry-server" probeResult="failure" output=< Mar 20 18:15:34 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 18:15:34 crc kubenswrapper[4690]: > Mar 20 18:15:51 crc kubenswrapper[4690]: I0320 18:15:51.837329 4690 generic.go:334] "Generic (PLEG): container finished" podID="8146ff99-3308-4b91-b487-3bd707bed4dd" containerID="7d2acaf39330b4a3decdbe295748d193ae75356e305474b95d81341fbd3c2886" exitCode=0 Mar 20 18:15:51 crc kubenswrapper[4690]: I0320 18:15:51.837467 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" event={"ID":"8146ff99-3308-4b91-b487-3bd707bed4dd","Type":"ContainerDied","Data":"7d2acaf39330b4a3decdbe295748d193ae75356e305474b95d81341fbd3c2886"} Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.333637 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517268 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-extra-config-0\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517608 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-ssh-key-openstack-edpm-ipam\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517658 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-2\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517683 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-0\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517742 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-1\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517761 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-1\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517835 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-3\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517876 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-inventory\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517905 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck7xx\" (UniqueName: \"kubernetes.io/projected/8146ff99-3308-4b91-b487-3bd707bed4dd-kube-api-access-ck7xx\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.517965 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-0\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.518027 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-combined-ca-bundle\") pod \"8146ff99-3308-4b91-b487-3bd707bed4dd\" (UID: \"8146ff99-3308-4b91-b487-3bd707bed4dd\") " Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.526891 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.544435 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8146ff99-3308-4b91-b487-3bd707bed4dd-kube-api-access-ck7xx" (OuterVolumeSpecName: "kube-api-access-ck7xx") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "kube-api-access-ck7xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.553749 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.560096 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.563274 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.565688 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.566044 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.575543 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.578486 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.582294 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-inventory" (OuterVolumeSpecName: "inventory") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.591683 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "8146ff99-3308-4b91-b487-3bd707bed4dd" (UID: "8146ff99-3308-4b91-b487-3bd707bed4dd"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620742 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620810 4690 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620831 4690 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620851 4690 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620870 4690 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620889 4690 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620908 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620933 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck7xx\" (UniqueName: \"kubernetes.io/projected/8146ff99-3308-4b91-b487-3bd707bed4dd-kube-api-access-ck7xx\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.620976 4690 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.621007 4690 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.621035 4690 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8146ff99-3308-4b91-b487-3bd707bed4dd-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.861652 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" event={"ID":"8146ff99-3308-4b91-b487-3bd707bed4dd","Type":"ContainerDied","Data":"3e4a0de290c84f90311be82b7dcc45c26cc4700fb8b51126e068101b680f009e"} Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 
18:15:53.861733 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e4a0de290c84f90311be82b7dcc45c26cc4700fb8b51126e068101b680f009e" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.861738 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bs8n5" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.982735 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629"] Mar 20 18:15:53 crc kubenswrapper[4690]: E0320 18:15:53.983124 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b5f94d-7149-4343-8bda-d76f89818a1c" containerName="collect-profiles" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.983141 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b5f94d-7149-4343-8bda-d76f89818a1c" containerName="collect-profiles" Mar 20 18:15:53 crc kubenswrapper[4690]: E0320 18:15:53.983156 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8146ff99-3308-4b91-b487-3bd707bed4dd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.983162 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8146ff99-3308-4b91-b487-3bd707bed4dd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.983458 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b5f94d-7149-4343-8bda-d76f89818a1c" containerName="collect-profiles" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.983491 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8146ff99-3308-4b91-b487-3bd707bed4dd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.984196 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.991303 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.991409 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.991773 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-k9qb4" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.991815 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:15:53 crc kubenswrapper[4690]: I0320 18:15:53.994815 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.002332 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629"] Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.131923 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.132008 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n5ks\" (UniqueName: \"kubernetes.io/projected/0fb2f304-f772-4ce8-8372-177341555106-kube-api-access-7n5ks\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.132093 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.132142 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.132220 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.132246 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.132293 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.233463 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.233539 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n5ks\" (UniqueName: \"kubernetes.io/projected/0fb2f304-f772-4ce8-8372-177341555106-kube-api-access-7n5ks\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.233602 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.233620 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.233646 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.233664 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.233686 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.238645 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.238666 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.238674 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.239157 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.239563 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.242861 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.247892 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n5ks\" (UniqueName: 
\"kubernetes.io/projected/0fb2f304-f772-4ce8-8372-177341555106-kube-api-access-7n5ks\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qn629\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.303323 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:15:54 crc kubenswrapper[4690]: I0320 18:15:54.901947 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629"] Mar 20 18:15:54 crc kubenswrapper[4690]: W0320 18:15:54.914192 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fb2f304_f772_4ce8_8372_177341555106.slice/crio-45a80b7cd5fd6e719135186e8d194b633c97ae858e147f7b0e79c17afbfd7b41 WatchSource:0}: Error finding container 45a80b7cd5fd6e719135186e8d194b633c97ae858e147f7b0e79c17afbfd7b41: Status 404 returned error can't find the container with id 45a80b7cd5fd6e719135186e8d194b633c97ae858e147f7b0e79c17afbfd7b41 Mar 20 18:15:55 crc kubenswrapper[4690]: I0320 18:15:55.888938 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" event={"ID":"0fb2f304-f772-4ce8-8372-177341555106","Type":"ContainerStarted","Data":"caa473aac4a04e19fe39e19a4372de66600db55b7c6a20159a9274a6cdf45a04"} Mar 20 18:15:55 crc kubenswrapper[4690]: I0320 18:15:55.889470 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" event={"ID":"0fb2f304-f772-4ce8-8372-177341555106","Type":"ContainerStarted","Data":"45a80b7cd5fd6e719135186e8d194b633c97ae858e147f7b0e79c17afbfd7b41"} Mar 20 18:15:55 crc kubenswrapper[4690]: I0320 18:15:55.940897 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" podStartSLOduration=2.300826503 podStartE2EDuration="2.940869409s" podCreationTimestamp="2026-03-20 18:15:53 +0000 UTC" firstStartedPulling="2026-03-20 18:15:54.917002913 +0000 UTC m=+2629.782828631" lastFinishedPulling="2026-03-20 18:15:55.557045819 +0000 UTC m=+2630.422871537" observedRunningTime="2026-03-20 18:15:55.937850104 +0000 UTC m=+2630.803675832" watchObservedRunningTime="2026-03-20 18:15:55.940869409 +0000 UTC m=+2630.806695097" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.137900 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567176-2dprz"] Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.139649 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-2dprz" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.144637 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.144691 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.145931 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.158026 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-2dprz"] Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.261707 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9rq\" (UniqueName: \"kubernetes.io/projected/39c7e9bc-b76b-4754-8f65-42c99bcc0542-kube-api-access-rs9rq\") pod \"auto-csr-approver-29567176-2dprz\" (UID: \"39c7e9bc-b76b-4754-8f65-42c99bcc0542\") " pod="openshift-infra/auto-csr-approver-29567176-2dprz" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.363813 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9rq\" (UniqueName: \"kubernetes.io/projected/39c7e9bc-b76b-4754-8f65-42c99bcc0542-kube-api-access-rs9rq\") pod \"auto-csr-approver-29567176-2dprz\" (UID: \"39c7e9bc-b76b-4754-8f65-42c99bcc0542\") " pod="openshift-infra/auto-csr-approver-29567176-2dprz" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.385640 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9rq\" (UniqueName: \"kubernetes.io/projected/39c7e9bc-b76b-4754-8f65-42c99bcc0542-kube-api-access-rs9rq\") pod \"auto-csr-approver-29567176-2dprz\" (UID: \"39c7e9bc-b76b-4754-8f65-42c99bcc0542\") " pod="openshift-infra/auto-csr-approver-29567176-2dprz" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.469898 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-2dprz" Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.744835 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-2dprz"] Mar 20 18:16:00 crc kubenswrapper[4690]: I0320 18:16:00.952151 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-2dprz" event={"ID":"39c7e9bc-b76b-4754-8f65-42c99bcc0542","Type":"ContainerStarted","Data":"a222d49d9fd0df05531dd1db7dd3a10d03b964e3d7955f8ea7f82f2c3dfe31aa"} Mar 20 18:16:02 crc kubenswrapper[4690]: I0320 18:16:02.974331 4690 generic.go:334] "Generic (PLEG): container finished" podID="39c7e9bc-b76b-4754-8f65-42c99bcc0542" containerID="3356581b01b530f4689a8ec2ef78513d83e5dcbefcc327543cdfc05297f92891" exitCode=0 Mar 20 18:16:02 crc kubenswrapper[4690]: I0320 18:16:02.974461 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-2dprz" event={"ID":"39c7e9bc-b76b-4754-8f65-42c99bcc0542","Type":"ContainerDied","Data":"3356581b01b530f4689a8ec2ef78513d83e5dcbefcc327543cdfc05297f92891"} Mar 20 18:16:04 crc kubenswrapper[4690]: I0320 18:16:04.344213 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-2dprz" Mar 20 18:16:04 crc kubenswrapper[4690]: I0320 18:16:04.448115 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs9rq\" (UniqueName: \"kubernetes.io/projected/39c7e9bc-b76b-4754-8f65-42c99bcc0542-kube-api-access-rs9rq\") pod \"39c7e9bc-b76b-4754-8f65-42c99bcc0542\" (UID: \"39c7e9bc-b76b-4754-8f65-42c99bcc0542\") " Mar 20 18:16:04 crc kubenswrapper[4690]: I0320 18:16:04.453710 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c7e9bc-b76b-4754-8f65-42c99bcc0542-kube-api-access-rs9rq" (OuterVolumeSpecName: "kube-api-access-rs9rq") pod "39c7e9bc-b76b-4754-8f65-42c99bcc0542" (UID: "39c7e9bc-b76b-4754-8f65-42c99bcc0542"). InnerVolumeSpecName "kube-api-access-rs9rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:16:04 crc kubenswrapper[4690]: I0320 18:16:04.550367 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs9rq\" (UniqueName: \"kubernetes.io/projected/39c7e9bc-b76b-4754-8f65-42c99bcc0542-kube-api-access-rs9rq\") on node \"crc\" DevicePath \"\"" Mar 20 18:16:04 crc kubenswrapper[4690]: I0320 18:16:04.995731 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-2dprz" event={"ID":"39c7e9bc-b76b-4754-8f65-42c99bcc0542","Type":"ContainerDied","Data":"a222d49d9fd0df05531dd1db7dd3a10d03b964e3d7955f8ea7f82f2c3dfe31aa"} Mar 20 18:16:04 crc kubenswrapper[4690]: I0320 18:16:04.995788 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a222d49d9fd0df05531dd1db7dd3a10d03b964e3d7955f8ea7f82f2c3dfe31aa" Mar 20 18:16:04 crc kubenswrapper[4690]: I0320 18:16:04.995799 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-2dprz" Mar 20 18:16:05 crc kubenswrapper[4690]: I0320 18:16:05.440365 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-ss7qv"] Mar 20 18:16:05 crc kubenswrapper[4690]: I0320 18:16:05.452196 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-ss7qv"] Mar 20 18:16:05 crc kubenswrapper[4690]: I0320 18:16:05.904161 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f1cf15-883c-4bac-8695-749c4d80c353" path="/var/lib/kubelet/pods/f6f1cf15-883c-4bac-8695-749c4d80c353/volumes" Mar 20 18:16:23 crc kubenswrapper[4690]: I0320 18:16:23.571277 4690 scope.go:117] "RemoveContainer" containerID="35cb50e14fcb85f8bb096379d4bacd0e363d09fa2f18a91b7452a9c13579e8fc" Mar 20 18:17:23 crc kubenswrapper[4690]: I0320 18:17:23.689964 4690 scope.go:117] "RemoveContainer" containerID="0d65534597a77f19db03240ba65a77279a5cdefe27a79197170d48fbb7c5d300" Mar 20 18:17:23 crc kubenswrapper[4690]: I0320 18:17:23.723614 4690 scope.go:117] "RemoveContainer" containerID="d7632ecec498715b45117fd09f264bb0fed0fe335f3301992484c0df179cb01b" Mar 20 18:17:23 crc kubenswrapper[4690]: I0320 18:17:23.751059 4690 scope.go:117] "RemoveContainer" containerID="d1c99c91e11d00e6ed285ef9a41e1f5ee2e3ffdc826f84aa1511d05249b3517c" Mar 20 18:17:54 crc kubenswrapper[4690]: I0320 18:17:54.273976 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:17:54 crc kubenswrapper[4690]: I0320 18:17:54.274720 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.151293 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567178-96l5z"] Mar 20 18:18:00 crc kubenswrapper[4690]: E0320 18:18:00.153312 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c7e9bc-b76b-4754-8f65-42c99bcc0542" containerName="oc" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.153401 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c7e9bc-b76b-4754-8f65-42c99bcc0542" containerName="oc" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.153640 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c7e9bc-b76b-4754-8f65-42c99bcc0542" containerName="oc" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.154443 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-96l5z" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.157231 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.160729 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.161414 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.173359 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-96l5z"] Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.208755 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bv4\" (UniqueName: \"kubernetes.io/projected/b4c78c20-5d06-4ae9-b2d8-1038463e235d-kube-api-access-c2bv4\") pod \"auto-csr-approver-29567178-96l5z\" (UID: \"b4c78c20-5d06-4ae9-b2d8-1038463e235d\") " pod="openshift-infra/auto-csr-approver-29567178-96l5z" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.309804 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2bv4\" (UniqueName: \"kubernetes.io/projected/b4c78c20-5d06-4ae9-b2d8-1038463e235d-kube-api-access-c2bv4\") pod \"auto-csr-approver-29567178-96l5z\" (UID: \"b4c78c20-5d06-4ae9-b2d8-1038463e235d\") " pod="openshift-infra/auto-csr-approver-29567178-96l5z" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.330508 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bv4\" (UniqueName: \"kubernetes.io/projected/b4c78c20-5d06-4ae9-b2d8-1038463e235d-kube-api-access-c2bv4\") pod \"auto-csr-approver-29567178-96l5z\" (UID: \"b4c78c20-5d06-4ae9-b2d8-1038463e235d\") " pod="openshift-infra/auto-csr-approver-29567178-96l5z" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.478836 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-96l5z" Mar 20 18:18:00 crc kubenswrapper[4690]: I0320 18:18:00.965179 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-96l5z"] Mar 20 18:18:01 crc kubenswrapper[4690]: I0320 18:18:01.295289 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-96l5z" event={"ID":"b4c78c20-5d06-4ae9-b2d8-1038463e235d","Type":"ContainerStarted","Data":"e26cecbabc4c28a498803d973d04f297e4a1a10078dea86515e4f0e2e31a5a42"} Mar 20 18:18:02 crc kubenswrapper[4690]: I0320 18:18:02.305826 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-96l5z" event={"ID":"b4c78c20-5d06-4ae9-b2d8-1038463e235d","Type":"ContainerStarted","Data":"3d170acfe4e91036374e01600024966964dd16c92b739363e9923deaa98dc1c8"} Mar 20 18:18:02 crc kubenswrapper[4690]: I0320 18:18:02.321902 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567178-96l5z" podStartSLOduration=1.358908206 podStartE2EDuration="2.321883404s" podCreationTimestamp="2026-03-20 18:18:00 +0000 UTC" firstStartedPulling="2026-03-20 18:18:00.97379869 +0000 UTC m=+2755.839624388" lastFinishedPulling="2026-03-20 18:18:01.936773908 +0000 UTC m=+2756.802599586" observedRunningTime="2026-03-20 18:18:02.320513235 +0000 UTC m=+2757.186338923" watchObservedRunningTime="2026-03-20 18:18:02.321883404 +0000 UTC m=+2757.187709082" Mar 20 18:18:03 crc kubenswrapper[4690]: I0320 18:18:03.321799 4690 generic.go:334] "Generic (PLEG): container finished" podID="b4c78c20-5d06-4ae9-b2d8-1038463e235d" containerID="3d170acfe4e91036374e01600024966964dd16c92b739363e9923deaa98dc1c8" exitCode=0 Mar 20 18:18:03 crc kubenswrapper[4690]: I0320 18:18:03.321905 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-96l5z" event={"ID":"b4c78c20-5d06-4ae9-b2d8-1038463e235d","Type":"ContainerDied","Data":"3d170acfe4e91036374e01600024966964dd16c92b739363e9923deaa98dc1c8"} Mar 20 18:18:04 crc kubenswrapper[4690]: I0320 18:18:04.618085 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-96l5z" Mar 20 18:18:04 crc kubenswrapper[4690]: I0320 18:18:04.810635 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2bv4\" (UniqueName: \"kubernetes.io/projected/b4c78c20-5d06-4ae9-b2d8-1038463e235d-kube-api-access-c2bv4\") pod \"b4c78c20-5d06-4ae9-b2d8-1038463e235d\" (UID: \"b4c78c20-5d06-4ae9-b2d8-1038463e235d\") " Mar 20 18:18:04 crc kubenswrapper[4690]: I0320 18:18:04.825854 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c78c20-5d06-4ae9-b2d8-1038463e235d-kube-api-access-c2bv4" (OuterVolumeSpecName: "kube-api-access-c2bv4") pod "b4c78c20-5d06-4ae9-b2d8-1038463e235d" (UID: "b4c78c20-5d06-4ae9-b2d8-1038463e235d"). InnerVolumeSpecName "kube-api-access-c2bv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:18:04 crc kubenswrapper[4690]: I0320 18:18:04.912404 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2bv4\" (UniqueName: \"kubernetes.io/projected/b4c78c20-5d06-4ae9-b2d8-1038463e235d-kube-api-access-c2bv4\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:05 crc kubenswrapper[4690]: I0320 18:18:05.344770 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-96l5z" event={"ID":"b4c78c20-5d06-4ae9-b2d8-1038463e235d","Type":"ContainerDied","Data":"e26cecbabc4c28a498803d973d04f297e4a1a10078dea86515e4f0e2e31a5a42"} Mar 20 18:18:05 crc kubenswrapper[4690]: I0320 18:18:05.345313 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e26cecbabc4c28a498803d973d04f297e4a1a10078dea86515e4f0e2e31a5a42" Mar 20 18:18:05 crc kubenswrapper[4690]: I0320 18:18:05.344889 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-96l5z" Mar 20 18:18:05 crc kubenswrapper[4690]: I0320 18:18:05.400200 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-h7kxj"] Mar 20 18:18:05 crc kubenswrapper[4690]: I0320 18:18:05.410451 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-h7kxj"] Mar 20 18:18:05 crc kubenswrapper[4690]: I0320 18:18:05.894972 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d" path="/var/lib/kubelet/pods/e2ae2e33-2ac4-4ff3-a4c1-48c733a21a0d/volumes" Mar 20 18:18:23 crc kubenswrapper[4690]: I0320 18:18:23.872984 4690 scope.go:117] "RemoveContainer" containerID="386988d99d4cb11dafa7d024a5bd3cc5301f9c479049f50cbf1bbd3273d719a0" Mar 20 18:18:24 crc kubenswrapper[4690]: I0320 18:18:24.274762 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:18:24 crc kubenswrapper[4690]: I0320 18:18:24.275403 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:18:27 crc kubenswrapper[4690]: I0320 18:18:27.595738 4690 generic.go:334] "Generic (PLEG): container finished" podID="0fb2f304-f772-4ce8-8372-177341555106" containerID="caa473aac4a04e19fe39e19a4372de66600db55b7c6a20159a9274a6cdf45a04" exitCode=0 Mar 20 18:18:27 crc kubenswrapper[4690]: I0320 18:18:27.595805 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" event={"ID":"0fb2f304-f772-4ce8-8372-177341555106","Type":"ContainerDied","Data":"caa473aac4a04e19fe39e19a4372de66600db55b7c6a20159a9274a6cdf45a04"} Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.241374 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q46sl"] Mar 20 18:18:28 crc kubenswrapper[4690]: E0320 18:18:28.241826 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c78c20-5d06-4ae9-b2d8-1038463e235d" containerName="oc" 
Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.241845 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c78c20-5d06-4ae9-b2d8-1038463e235d" containerName="oc" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.242137 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c78c20-5d06-4ae9-b2d8-1038463e235d" containerName="oc" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.244497 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.292820 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q46sl"] Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.342003 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-catalog-content\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.342449 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-utilities\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.342641 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9pj2\" (UniqueName: \"kubernetes.io/projected/da566365-93b4-442f-a116-83fd58e319cb-kube-api-access-x9pj2\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.444490 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-catalog-content\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.444593 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-utilities\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.444630 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9pj2\" (UniqueName: \"kubernetes.io/projected/da566365-93b4-442f-a116-83fd58e319cb-kube-api-access-x9pj2\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.445279 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-catalog-content\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc 
kubenswrapper[4690]: I0320 18:18:28.445503 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-utilities\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.475984 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9pj2\" (UniqueName: \"kubernetes.io/projected/da566365-93b4-442f-a116-83fd58e319cb-kube-api-access-x9pj2\") pod \"redhat-operators-q46sl\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:28 crc kubenswrapper[4690]: I0320 18:18:28.574629 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.046927 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q46sl"] Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.061144 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.161004 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n5ks\" (UniqueName: \"kubernetes.io/projected/0fb2f304-f772-4ce8-8372-177341555106-kube-api-access-7n5ks\") pod \"0fb2f304-f772-4ce8-8372-177341555106\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.161099 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-0\") pod \"0fb2f304-f772-4ce8-8372-177341555106\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.161147 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-telemetry-combined-ca-bundle\") pod \"0fb2f304-f772-4ce8-8372-177341555106\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.161177 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-2\") pod \"0fb2f304-f772-4ce8-8372-177341555106\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.161426 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ssh-key-openstack-edpm-ipam\") pod \"0fb2f304-f772-4ce8-8372-177341555106\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.161475 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-inventory\") pod \"0fb2f304-f772-4ce8-8372-177341555106\" (UID: 
\"0fb2f304-f772-4ce8-8372-177341555106\") " Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.161562 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-1\") pod \"0fb2f304-f772-4ce8-8372-177341555106\" (UID: \"0fb2f304-f772-4ce8-8372-177341555106\") " Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.169012 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0fb2f304-f772-4ce8-8372-177341555106" (UID: "0fb2f304-f772-4ce8-8372-177341555106"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.176832 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb2f304-f772-4ce8-8372-177341555106-kube-api-access-7n5ks" (OuterVolumeSpecName: "kube-api-access-7n5ks") pod "0fb2f304-f772-4ce8-8372-177341555106" (UID: "0fb2f304-f772-4ce8-8372-177341555106"). InnerVolumeSpecName "kube-api-access-7n5ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.192324 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0fb2f304-f772-4ce8-8372-177341555106" (UID: "0fb2f304-f772-4ce8-8372-177341555106"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.193379 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0fb2f304-f772-4ce8-8372-177341555106" (UID: "0fb2f304-f772-4ce8-8372-177341555106"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.195893 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-inventory" (OuterVolumeSpecName: "inventory") pod "0fb2f304-f772-4ce8-8372-177341555106" (UID: "0fb2f304-f772-4ce8-8372-177341555106"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.209429 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0fb2f304-f772-4ce8-8372-177341555106" (UID: "0fb2f304-f772-4ce8-8372-177341555106"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.213937 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0fb2f304-f772-4ce8-8372-177341555106" (UID: "0fb2f304-f772-4ce8-8372-177341555106"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.263665 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.263710 4690 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.263724 4690 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.263737 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n5ks\" (UniqueName: \"kubernetes.io/projected/0fb2f304-f772-4ce8-8372-177341555106-kube-api-access-7n5ks\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.263746 4690 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.263754 4690 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.263764 4690 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb2f304-f772-4ce8-8372-177341555106-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.618839 4690 generic.go:334] "Generic (PLEG): container finished" podID="da566365-93b4-442f-a116-83fd58e319cb" containerID="a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb" exitCode=0 Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.618932 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q46sl" event={"ID":"da566365-93b4-442f-a116-83fd58e319cb","Type":"ContainerDied","Data":"a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb"} Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.618968 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q46sl" event={"ID":"da566365-93b4-442f-a116-83fd58e319cb","Type":"ContainerStarted","Data":"e42b838e9b1fa7477f5a58d4906183c272f85ab40957559a6a2ce8dac0679848"} Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.623163 4690 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.623683 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" event={"ID":"0fb2f304-f772-4ce8-8372-177341555106","Type":"ContainerDied","Data":"45a80b7cd5fd6e719135186e8d194b633c97ae858e147f7b0e79c17afbfd7b41"} Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.623719 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a80b7cd5fd6e719135186e8d194b633c97ae858e147f7b0e79c17afbfd7b41" Mar 20 18:18:29 crc kubenswrapper[4690]: I0320 18:18:29.623774 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qn629" Mar 20 18:18:30 crc kubenswrapper[4690]: I0320 18:18:30.638187 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q46sl" event={"ID":"da566365-93b4-442f-a116-83fd58e319cb","Type":"ContainerStarted","Data":"ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6"} Mar 20 18:18:31 crc kubenswrapper[4690]: I0320 18:18:31.654617 4690 generic.go:334] "Generic (PLEG): container finished" podID="da566365-93b4-442f-a116-83fd58e319cb" containerID="ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6" exitCode=0 Mar 20 18:18:31 crc kubenswrapper[4690]: I0320 18:18:31.654690 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q46sl" event={"ID":"da566365-93b4-442f-a116-83fd58e319cb","Type":"ContainerDied","Data":"ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6"} Mar 20 18:18:33 crc kubenswrapper[4690]: I0320 18:18:33.687340 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q46sl" event={"ID":"da566365-93b4-442f-a116-83fd58e319cb","Type":"ContainerStarted","Data":"a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce"} Mar 20 18:18:33 crc kubenswrapper[4690]: I0320 18:18:33.736098 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q46sl" podStartSLOduration=3.205584896 podStartE2EDuration="5.736070294s" podCreationTimestamp="2026-03-20 18:18:28 +0000 UTC" firstStartedPulling="2026-03-20 18:18:29.622882596 +0000 UTC m=+2784.488708274" lastFinishedPulling="2026-03-20 18:18:32.153367994 +0000 UTC m=+2787.019193672" observedRunningTime="2026-03-20 18:18:33.719182647 +0000 UTC m=+2788.585008355" watchObservedRunningTime="2026-03-20 18:18:33.736070294 +0000 UTC m=+2788.601896012" Mar 20 18:18:38 crc kubenswrapper[4690]: I0320 18:18:38.575620 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:38 crc kubenswrapper[4690]: I0320 18:18:38.576337 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:39 crc kubenswrapper[4690]: I0320 18:18:39.625768 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q46sl" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="registry-server" probeResult="failure" output=< Mar 20 18:18:39 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 18:18:39 crc kubenswrapper[4690]: > Mar 20 18:18:48 crc kubenswrapper[4690]: I0320 18:18:48.628577 4690 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:48 crc kubenswrapper[4690]: I0320 18:18:48.698944 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:48 crc kubenswrapper[4690]: I0320 18:18:48.875042 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q46sl"] Mar 20 18:18:49 crc kubenswrapper[4690]: I0320 18:18:49.847286 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q46sl" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="registry-server" containerID="cri-o://a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce" gracePeriod=2 Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.329742 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.484342 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-utilities\") pod \"da566365-93b4-442f-a116-83fd58e319cb\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.484475 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-catalog-content\") pod \"da566365-93b4-442f-a116-83fd58e319cb\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.484700 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9pj2\" (UniqueName: \"kubernetes.io/projected/da566365-93b4-442f-a116-83fd58e319cb-kube-api-access-x9pj2\") pod \"da566365-93b4-442f-a116-83fd58e319cb\" (UID: \"da566365-93b4-442f-a116-83fd58e319cb\") " Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.485570 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-utilities" (OuterVolumeSpecName: "utilities") pod "da566365-93b4-442f-a116-83fd58e319cb" (UID: "da566365-93b4-442f-a116-83fd58e319cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.489381 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da566365-93b4-442f-a116-83fd58e319cb-kube-api-access-x9pj2" (OuterVolumeSpecName: "kube-api-access-x9pj2") pod "da566365-93b4-442f-a116-83fd58e319cb" (UID: "da566365-93b4-442f-a116-83fd58e319cb"). InnerVolumeSpecName "kube-api-access-x9pj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.587094 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9pj2\" (UniqueName: \"kubernetes.io/projected/da566365-93b4-442f-a116-83fd58e319cb-kube-api-access-x9pj2\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.587380 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.634180 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da566365-93b4-442f-a116-83fd58e319cb" (UID: "da566365-93b4-442f-a116-83fd58e319cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.689344 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da566365-93b4-442f-a116-83fd58e319cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.858836 4690 generic.go:334] "Generic (PLEG): container finished" podID="da566365-93b4-442f-a116-83fd58e319cb" containerID="a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce" exitCode=0 Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.858870 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q46sl" event={"ID":"da566365-93b4-442f-a116-83fd58e319cb","Type":"ContainerDied","Data":"a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce"} Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.858920 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q46sl" event={"ID":"da566365-93b4-442f-a116-83fd58e319cb","Type":"ContainerDied","Data":"e42b838e9b1fa7477f5a58d4906183c272f85ab40957559a6a2ce8dac0679848"} Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.858954 4690 scope.go:117] "RemoveContainer" containerID="a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.860995 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q46sl" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.884149 4690 scope.go:117] "RemoveContainer" containerID="ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.909213 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q46sl"] Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.920564 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q46sl"] Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.922618 4690 scope.go:117] "RemoveContainer" containerID="a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.955795 4690 scope.go:117] "RemoveContainer" containerID="a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce" Mar 20 18:18:50 crc kubenswrapper[4690]: E0320 18:18:50.956381 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce\": container with ID starting with a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce not found: ID does not exist" containerID="a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.956449 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce"} err="failed to get container status \"a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce\": rpc error: code = NotFound desc = could not find container \"a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce\": container with ID starting with a48d11f35490443d8514f17413fe6f7689cdcc82532393f29537182f456bc4ce not found: ID does not exist" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.956492 4690 scope.go:117] "RemoveContainer" containerID="ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6" Mar 20 18:18:50 crc kubenswrapper[4690]: E0320 18:18:50.956923 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6\": container with ID starting with ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6 not found: ID does not exist" containerID="ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.956981 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6"} err="failed to get container status \"ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6\": rpc error: code = NotFound desc = could not find container \"ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6\": container with ID starting with ff1c608b09f086d05bf874d488975bd80a5c769031cf7db631fe9fa19718efe6 not found: ID does not exist" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.957020 4690 scope.go:117] "RemoveContainer" containerID="a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb" Mar 20 18:18:50 crc kubenswrapper[4690]: E0320 18:18:50.957377 4690 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb\": container with ID starting with a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb not found: ID does not exist" containerID="a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb" Mar 20 18:18:50 crc kubenswrapper[4690]: I0320 18:18:50.957423 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb"} err="failed to get container status \"a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb\": rpc error: code = NotFound desc = could not find container \"a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb\": container with ID starting with a39e739daaec34da791bad8e9588f84111db001c03afc6c3e746a53db2f159bb not found: ID does not exist" Mar 20 18:18:51 crc kubenswrapper[4690]: I0320 18:18:51.898173 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da566365-93b4-442f-a116-83fd58e319cb" path="/var/lib/kubelet/pods/da566365-93b4-442f-a116-83fd58e319cb/volumes" Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.274096 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.274652 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.274703 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.275357 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f75d6ec75b86f2dc0e83c5cf97b53edbe2da563ef7799aed9f421624b964264"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.275411 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://3f75d6ec75b86f2dc0e83c5cf97b53edbe2da563ef7799aed9f421624b964264" gracePeriod=600 Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.904319 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="3f75d6ec75b86f2dc0e83c5cf97b53edbe2da563ef7799aed9f421624b964264" exitCode=0 Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.904382 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"3f75d6ec75b86f2dc0e83c5cf97b53edbe2da563ef7799aed9f421624b964264"} 
Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.904662 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c"} Mar 20 18:18:54 crc kubenswrapper[4690]: I0320 18:18:54.904695 4690 scope.go:117] "RemoveContainer" containerID="24e5f76fee7e30729d09e38f23025e12449be266576373e532933c3f0101ae12" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.095439 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:19:13 crc kubenswrapper[4690]: E0320 18:19:13.096808 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="registry-server" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.096832 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="registry-server" Mar 20 18:19:13 crc kubenswrapper[4690]: E0320 18:19:13.096846 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="extract-utilities" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.096856 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="extract-utilities" Mar 20 18:19:13 crc kubenswrapper[4690]: E0320 18:19:13.096886 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="extract-content" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.096894 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="extract-content" Mar 20 18:19:13 crc kubenswrapper[4690]: E0320 18:19:13.096912 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb2f304-f772-4ce8-8372-177341555106" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.096921 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb2f304-f772-4ce8-8372-177341555106" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.097148 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb2f304-f772-4ce8-8372-177341555106" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.097169 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="da566365-93b4-442f-a116-83fd58e319cb" containerName="registry-server" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.098105 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.102588 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.102643 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.103201 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.104741 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b927d" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.121216 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.177872 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.177958 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-config-data\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.178007 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.178044 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.178295 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.178467 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.178583 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6krmb\" (UniqueName: 
\"kubernetes.io/projected/86a8f040-c0ab-4923-8bab-8123fd72e63e-kube-api-access-6krmb\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.178634 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.178690 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280369 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280443 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-config-data\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280476 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280492 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280545 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280579 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280616 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6krmb\" (UniqueName: \"kubernetes.io/projected/86a8f040-c0ab-4923-8bab-8123fd72e63e-kube-api-access-6krmb\") pod 
\"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280649 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.280690 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.281027 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.281628 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.281898 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.282437 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.282598 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-config-data\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.293160 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.296803 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 
18:19:13.305122 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.309417 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.312720 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6krmb\" (UniqueName: \"kubernetes.io/projected/86a8f040-c0ab-4923-8bab-8123fd72e63e-kube-api-access-6krmb\") pod \"tempest-tests-tempest\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.419920 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:19:13 crc kubenswrapper[4690]: I0320 18:19:13.872608 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:19:14 crc kubenswrapper[4690]: I0320 18:19:14.106666 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"86a8f040-c0ab-4923-8bab-8123fd72e63e","Type":"ContainerStarted","Data":"575edbdf4eeb50533aa1847c1c8f6fe49c58ecc0f0103261b62199486a021aa6"} Mar 20 18:19:50 crc kubenswrapper[4690]: E0320 18:19:50.835331 4690 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 20 18:19:50 crc kubenswrapper[4690]: E0320 18:19:50.836081 4690 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6krmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(86a8f040-c0ab-4923-8bab-8123fd72e63e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 18:19:50 crc kubenswrapper[4690]: E0320 18:19:50.837229 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="86a8f040-c0ab-4923-8bab-8123fd72e63e" Mar 20 18:19:51 crc kubenswrapper[4690]: E0320 18:19:51.698597 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="86a8f040-c0ab-4923-8bab-8123fd72e63e" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.148192 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567180-dgt87"] Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.150008 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-dgt87" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.152615 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.152856 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.154063 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.160178 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-dgt87"] Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.206350 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnjn4\" (UniqueName: \"kubernetes.io/projected/b4f1aafa-16b2-4d36-84da-fcbb45e17ba4-kube-api-access-bnjn4\") pod \"auto-csr-approver-29567180-dgt87\" (UID: \"b4f1aafa-16b2-4d36-84da-fcbb45e17ba4\") " pod="openshift-infra/auto-csr-approver-29567180-dgt87" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.307647 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnjn4\" (UniqueName: \"kubernetes.io/projected/b4f1aafa-16b2-4d36-84da-fcbb45e17ba4-kube-api-access-bnjn4\") pod \"auto-csr-approver-29567180-dgt87\" (UID: \"b4f1aafa-16b2-4d36-84da-fcbb45e17ba4\") " pod="openshift-infra/auto-csr-approver-29567180-dgt87" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.331219 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnjn4\" (UniqueName: \"kubernetes.io/projected/b4f1aafa-16b2-4d36-84da-fcbb45e17ba4-kube-api-access-bnjn4\") pod \"auto-csr-approver-29567180-dgt87\" (UID: \"b4f1aafa-16b2-4d36-84da-fcbb45e17ba4\") " pod="openshift-infra/auto-csr-approver-29567180-dgt87" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.502029 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-dgt87" Mar 20 18:20:00 crc kubenswrapper[4690]: I0320 18:20:00.968882 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-dgt87"] Mar 20 18:20:01 crc kubenswrapper[4690]: I0320 18:20:01.793342 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-dgt87" event={"ID":"b4f1aafa-16b2-4d36-84da-fcbb45e17ba4","Type":"ContainerStarted","Data":"451c32e54299be6198916747c1e764768a6b0c844265858cbf817d2f29ba0abc"} Mar 20 18:20:02 crc kubenswrapper[4690]: I0320 18:20:02.812291 4690 generic.go:334] "Generic (PLEG): container finished" podID="b4f1aafa-16b2-4d36-84da-fcbb45e17ba4" containerID="a09d091d4c1eabb850cbf47d8747b2dafa04a6f298358bf993b03224eb8171bc" exitCode=0 Mar 20 18:20:02 crc kubenswrapper[4690]: I0320 18:20:02.812397 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-dgt87" event={"ID":"b4f1aafa-16b2-4d36-84da-fcbb45e17ba4","Type":"ContainerDied","Data":"a09d091d4c1eabb850cbf47d8747b2dafa04a6f298358bf993b03224eb8171bc"} Mar 20 18:20:04 crc kubenswrapper[4690]: I0320 18:20:04.209432 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-dgt87" Mar 20 18:20:04 crc kubenswrapper[4690]: I0320 18:20:04.401641 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnjn4\" (UniqueName: \"kubernetes.io/projected/b4f1aafa-16b2-4d36-84da-fcbb45e17ba4-kube-api-access-bnjn4\") pod \"b4f1aafa-16b2-4d36-84da-fcbb45e17ba4\" (UID: \"b4f1aafa-16b2-4d36-84da-fcbb45e17ba4\") " Mar 20 18:20:04 crc kubenswrapper[4690]: I0320 18:20:04.418554 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f1aafa-16b2-4d36-84da-fcbb45e17ba4-kube-api-access-bnjn4" (OuterVolumeSpecName: "kube-api-access-bnjn4") pod "b4f1aafa-16b2-4d36-84da-fcbb45e17ba4" (UID: "b4f1aafa-16b2-4d36-84da-fcbb45e17ba4"). InnerVolumeSpecName "kube-api-access-bnjn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:20:04 crc kubenswrapper[4690]: I0320 18:20:04.505428 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnjn4\" (UniqueName: \"kubernetes.io/projected/b4f1aafa-16b2-4d36-84da-fcbb45e17ba4-kube-api-access-bnjn4\") on node \"crc\" DevicePath \"\"" Mar 20 18:20:04 crc kubenswrapper[4690]: I0320 18:20:04.836122 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-dgt87" event={"ID":"b4f1aafa-16b2-4d36-84da-fcbb45e17ba4","Type":"ContainerDied","Data":"451c32e54299be6198916747c1e764768a6b0c844265858cbf817d2f29ba0abc"} Mar 20 18:20:04 crc kubenswrapper[4690]: I0320 18:20:04.836181 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="451c32e54299be6198916747c1e764768a6b0c844265858cbf817d2f29ba0abc" Mar 20 18:20:04 crc kubenswrapper[4690]: I0320 18:20:04.836199 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-dgt87" Mar 20 18:20:05 crc kubenswrapper[4690]: I0320 18:20:05.299332 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-77prw"] Mar 20 18:20:05 crc kubenswrapper[4690]: I0320 18:20:05.310131 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-77prw"] Mar 20 18:20:05 crc kubenswrapper[4690]: I0320 18:20:05.909045 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bf052e-356e-49e0-af8e-e3c78c8da186" path="/var/lib/kubelet/pods/38bf052e-356e-49e0-af8e-e3c78c8da186/volumes" Mar 20 18:20:06 crc kubenswrapper[4690]: I0320 18:20:06.873916 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"86a8f040-c0ab-4923-8bab-8123fd72e63e","Type":"ContainerStarted","Data":"42bc5f8f53ea25c410557a52c9a563702a9ba2f4637ea7a15908c38f83c496c4"} Mar 20 18:20:06 crc kubenswrapper[4690]: I0320 18:20:06.903747 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.268870905 podStartE2EDuration="54.90372553s" podCreationTimestamp="2026-03-20 18:19:12 +0000 UTC" firstStartedPulling="2026-03-20 18:19:13.884581133 +0000 UTC m=+2828.750406851" lastFinishedPulling="2026-03-20 18:20:05.519435788 +0000 UTC m=+2880.385261476" observedRunningTime="2026-03-20 18:20:06.901381944 +0000 UTC m=+2881.767207642" watchObservedRunningTime="2026-03-20 18:20:06.90372553 +0000 UTC m=+2881.769551228" Mar 20 18:20:24 crc kubenswrapper[4690]: I0320 18:20:24.027355 4690 scope.go:117] "RemoveContainer" containerID="205143e133a6d4baff66c65824716168bade68aa5a43caf672791c4c360868ab" Mar 20 18:20:54 crc kubenswrapper[4690]: I0320 18:20:54.274072 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:20:54 crc kubenswrapper[4690]: I0320 18:20:54.274827 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:21:24 crc kubenswrapper[4690]: I0320 18:21:24.274748 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:21:24 crc kubenswrapper[4690]: I0320 18:21:24.275511 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.274858 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.275723 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.275804 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.277016 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.277138 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" gracePeriod=600 Mar 20 18:21:54 crc kubenswrapper[4690]: E0320 18:21:54.405631 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.997856 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" exitCode=0 Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.997983 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c"} Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.998204 4690 scope.go:117] "RemoveContainer" containerID="3f75d6ec75b86f2dc0e83c5cf97b53edbe2da563ef7799aed9f421624b964264" Mar 20 18:21:54 crc kubenswrapper[4690]: I0320 18:21:54.999077 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:21:54 crc kubenswrapper[4690]: E0320 18:21:54.999723 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.148986 4690 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567182-ccxc8"] Mar 20 18:22:00 crc kubenswrapper[4690]: E0320 18:22:00.149970 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f1aafa-16b2-4d36-84da-fcbb45e17ba4" containerName="oc" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.149985 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f1aafa-16b2-4d36-84da-fcbb45e17ba4" containerName="oc" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.150211 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f1aafa-16b2-4d36-84da-fcbb45e17ba4" containerName="oc" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.150960 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-ccxc8" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.153452 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.156088 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.156280 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.160252 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-ccxc8"] Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.261732 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g5qt\" (UniqueName: \"kubernetes.io/projected/cc295a8d-40fc-4bc8-bcd3-969af4663933-kube-api-access-6g5qt\") pod \"auto-csr-approver-29567182-ccxc8\" (UID: \"cc295a8d-40fc-4bc8-bcd3-969af4663933\") " pod="openshift-infra/auto-csr-approver-29567182-ccxc8" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.363763 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g5qt\" (UniqueName: \"kubernetes.io/projected/cc295a8d-40fc-4bc8-bcd3-969af4663933-kube-api-access-6g5qt\") pod \"auto-csr-approver-29567182-ccxc8\" (UID: \"cc295a8d-40fc-4bc8-bcd3-969af4663933\") " pod="openshift-infra/auto-csr-approver-29567182-ccxc8" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.390317 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g5qt\" (UniqueName: \"kubernetes.io/projected/cc295a8d-40fc-4bc8-bcd3-969af4663933-kube-api-access-6g5qt\") pod \"auto-csr-approver-29567182-ccxc8\" (UID: \"cc295a8d-40fc-4bc8-bcd3-969af4663933\") " pod="openshift-infra/auto-csr-approver-29567182-ccxc8" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.510911 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-ccxc8" Mar 20 18:22:00 crc kubenswrapper[4690]: I0320 18:22:00.958378 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-ccxc8"] Mar 20 18:22:01 crc kubenswrapper[4690]: I0320 18:22:01.055817 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-ccxc8" event={"ID":"cc295a8d-40fc-4bc8-bcd3-969af4663933","Type":"ContainerStarted","Data":"4c46d50872901ea8ce456d7465a1df7118d315f90a6497991ecafc6785141889"} Mar 20 18:22:03 crc kubenswrapper[4690]: I0320 18:22:03.079541 4690 generic.go:334] "Generic (PLEG): container finished" podID="cc295a8d-40fc-4bc8-bcd3-969af4663933" containerID="622489cbfdda45a236856cf4f63dcbc208329ecf8b558b11bf3685ce116fdd61" exitCode=0 Mar 20 18:22:03 crc kubenswrapper[4690]: I0320 18:22:03.079622 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-ccxc8" event={"ID":"cc295a8d-40fc-4bc8-bcd3-969af4663933","Type":"ContainerDied","Data":"622489cbfdda45a236856cf4f63dcbc208329ecf8b558b11bf3685ce116fdd61"} Mar 20 18:22:04 crc kubenswrapper[4690]: I0320 18:22:04.454073 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-ccxc8" Mar 20 18:22:04 crc kubenswrapper[4690]: I0320 18:22:04.570034 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g5qt\" (UniqueName: \"kubernetes.io/projected/cc295a8d-40fc-4bc8-bcd3-969af4663933-kube-api-access-6g5qt\") pod \"cc295a8d-40fc-4bc8-bcd3-969af4663933\" (UID: \"cc295a8d-40fc-4bc8-bcd3-969af4663933\") " Mar 20 18:22:04 crc kubenswrapper[4690]: I0320 18:22:04.575411 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc295a8d-40fc-4bc8-bcd3-969af4663933-kube-api-access-6g5qt" (OuterVolumeSpecName: "kube-api-access-6g5qt") pod "cc295a8d-40fc-4bc8-bcd3-969af4663933" (UID: "cc295a8d-40fc-4bc8-bcd3-969af4663933"). InnerVolumeSpecName "kube-api-access-6g5qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:22:04 crc kubenswrapper[4690]: I0320 18:22:04.672621 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g5qt\" (UniqueName: \"kubernetes.io/projected/cc295a8d-40fc-4bc8-bcd3-969af4663933-kube-api-access-6g5qt\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:05 crc kubenswrapper[4690]: I0320 18:22:05.097660 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-ccxc8" event={"ID":"cc295a8d-40fc-4bc8-bcd3-969af4663933","Type":"ContainerDied","Data":"4c46d50872901ea8ce456d7465a1df7118d315f90a6497991ecafc6785141889"} Mar 20 18:22:05 crc kubenswrapper[4690]: I0320 18:22:05.097995 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c46d50872901ea8ce456d7465a1df7118d315f90a6497991ecafc6785141889" Mar 20 18:22:05 crc kubenswrapper[4690]: I0320 18:22:05.097731 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-ccxc8" Mar 20 18:22:05 crc kubenswrapper[4690]: I0320 18:22:05.532344 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-2dprz"] Mar 20 18:22:05 crc kubenswrapper[4690]: I0320 18:22:05.541824 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-2dprz"] Mar 20 18:22:05 crc kubenswrapper[4690]: I0320 18:22:05.894757 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:22:05 crc kubenswrapper[4690]: E0320 18:22:05.895065 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:22:05 crc kubenswrapper[4690]: I0320 18:22:05.903212 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c7e9bc-b76b-4754-8f65-42c99bcc0542" path="/var/lib/kubelet/pods/39c7e9bc-b76b-4754-8f65-42c99bcc0542/volumes" Mar 20 18:22:14 crc kubenswrapper[4690]: I0320 18:22:14.864339 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k5fhg"] Mar 20 18:22:14 crc kubenswrapper[4690]: E0320 18:22:14.865550 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc295a8d-40fc-4bc8-bcd3-969af4663933" containerName="oc" Mar 20 18:22:14 crc kubenswrapper[4690]: I0320 18:22:14.865574 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc295a8d-40fc-4bc8-bcd3-969af4663933" containerName="oc" Mar 20 18:22:14 crc kubenswrapper[4690]: I0320 18:22:14.865897 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc295a8d-40fc-4bc8-bcd3-969af4663933" containerName="oc" Mar 20 18:22:14 crc kubenswrapper[4690]: I0320 18:22:14.873813 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:14 crc kubenswrapper[4690]: I0320 18:22:14.876078 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5fhg"] Mar 20 18:22:14 crc kubenswrapper[4690]: I0320 18:22:14.902700 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8d95\" (UniqueName: \"kubernetes.io/projected/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-kube-api-access-x8d95\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:14 crc kubenswrapper[4690]: I0320 18:22:14.902771 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-catalog-content\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:14 crc kubenswrapper[4690]: I0320 18:22:14.902909 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-utilities\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:15 crc kubenswrapper[4690]: I0320 18:22:15.004395 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8d95\" (UniqueName: \"kubernetes.io/projected/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-kube-api-access-x8d95\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:15 crc kubenswrapper[4690]: I0320 18:22:15.004452 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-catalog-content\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:15 crc kubenswrapper[4690]: I0320 18:22:15.004555 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-utilities\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:15 crc kubenswrapper[4690]: I0320 18:22:15.005007 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-utilities\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:15 crc kubenswrapper[4690]: I0320 18:22:15.005163 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-catalog-content\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:15 crc kubenswrapper[4690]: I0320 18:22:15.029782 4690 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x8d95\" (UniqueName: \"kubernetes.io/projected/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-kube-api-access-x8d95\") pod \"redhat-marketplace-k5fhg\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:15 crc kubenswrapper[4690]: I0320 18:22:15.237110 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:15 crc kubenswrapper[4690]: I0320 18:22:15.663969 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5fhg"] Mar 20 18:22:16 crc kubenswrapper[4690]: I0320 18:22:16.195428 4690 generic.go:334] "Generic (PLEG): container finished" podID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerID="a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae" exitCode=0 Mar 20 18:22:16 crc kubenswrapper[4690]: I0320 18:22:16.195505 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5fhg" event={"ID":"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2","Type":"ContainerDied","Data":"a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae"} Mar 20 18:22:16 crc kubenswrapper[4690]: I0320 18:22:16.197896 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5fhg" event={"ID":"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2","Type":"ContainerStarted","Data":"33262ed2565f354a470f0e068d0af15fee0bb910e54f2e0d5252781b9dee65da"} Mar 20 18:22:17 crc kubenswrapper[4690]: I0320 18:22:17.213645 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5fhg" event={"ID":"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2","Type":"ContainerStarted","Data":"2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288"} Mar 20 18:22:18 crc kubenswrapper[4690]: I0320 18:22:18.223673 4690 generic.go:334] "Generic (PLEG): container finished" podID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerID="2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288" exitCode=0 Mar 20 18:22:18 crc kubenswrapper[4690]: I0320 18:22:18.223774 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5fhg" event={"ID":"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2","Type":"ContainerDied","Data":"2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288"} Mar 20 18:22:19 crc kubenswrapper[4690]: I0320 18:22:19.236656 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5fhg" event={"ID":"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2","Type":"ContainerStarted","Data":"68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832"} Mar 20 18:22:19 crc kubenswrapper[4690]: I0320 18:22:19.281296 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k5fhg" podStartSLOduration=2.869118716 podStartE2EDuration="5.281281485s" podCreationTimestamp="2026-03-20 18:22:14 +0000 UTC" firstStartedPulling="2026-03-20 18:22:16.198470837 +0000 UTC m=+3011.064296545" lastFinishedPulling="2026-03-20 18:22:18.610633636 +0000 UTC m=+3013.476459314" observedRunningTime="2026-03-20 18:22:19.277644502 +0000 UTC m=+3014.143470180" watchObservedRunningTime="2026-03-20 18:22:19.281281485 +0000 UTC m=+3014.147107163" Mar 20 18:22:20 crc kubenswrapper[4690]: I0320 18:22:20.884088 4690 scope.go:117] "RemoveContainer" 
containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:22:20 crc kubenswrapper[4690]: E0320 18:22:20.884592 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:22:24 crc kubenswrapper[4690]: I0320 18:22:24.147531 4690 scope.go:117] "RemoveContainer" containerID="3356581b01b530f4689a8ec2ef78513d83e5dcbefcc327543cdfc05297f92891" Mar 20 18:22:25 crc kubenswrapper[4690]: I0320 18:22:25.237646 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:25 crc kubenswrapper[4690]: I0320 18:22:25.237700 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:25 crc kubenswrapper[4690]: I0320 18:22:25.315720 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:25 crc kubenswrapper[4690]: I0320 18:22:25.388656 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:25 crc kubenswrapper[4690]: I0320 18:22:25.572551 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5fhg"] Mar 20 18:22:27 crc kubenswrapper[4690]: I0320 18:22:27.331558 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k5fhg" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerName="registry-server" containerID="cri-o://68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832" gracePeriod=2 Mar 20 18:22:27 crc kubenswrapper[4690]: I0320 18:22:27.934954 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.098046 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-catalog-content\") pod \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.098145 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-utilities\") pod \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.098293 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8d95\" (UniqueName: \"kubernetes.io/projected/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-kube-api-access-x8d95\") pod \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\" (UID: \"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2\") " Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.100822 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-utilities" (OuterVolumeSpecName: "utilities") pod "f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" (UID: "f3cb84cb-306f-423f-9fd5-81b2b24e8fd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.112346 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-kube-api-access-x8d95" (OuterVolumeSpecName: "kube-api-access-x8d95") pod "f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" (UID: "f3cb84cb-306f-423f-9fd5-81b2b24e8fd2"). InnerVolumeSpecName "kube-api-access-x8d95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.138731 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" (UID: "f3cb84cb-306f-423f-9fd5-81b2b24e8fd2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.201447 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.201499 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.201518 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8d95\" (UniqueName: \"kubernetes.io/projected/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2-kube-api-access-x8d95\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.348166 4690 generic.go:334] "Generic (PLEG): container finished" podID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerID="68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832" exitCode=0 Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.348226 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5fhg" event={"ID":"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2","Type":"ContainerDied","Data":"68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832"} Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.348337 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5fhg" event={"ID":"f3cb84cb-306f-423f-9fd5-81b2b24e8fd2","Type":"ContainerDied","Data":"33262ed2565f354a470f0e068d0af15fee0bb910e54f2e0d5252781b9dee65da"} Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.348274 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5fhg" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.348416 4690 scope.go:117] "RemoveContainer" containerID="68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.389510 4690 scope.go:117] "RemoveContainer" containerID="2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.393912 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5fhg"] Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.406716 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5fhg"] Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.416008 4690 scope.go:117] "RemoveContainer" containerID="a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.493236 4690 scope.go:117] "RemoveContainer" containerID="68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832" Mar 20 18:22:28 crc kubenswrapper[4690]: E0320 18:22:28.493994 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832\": container with ID starting with 68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832 not found: ID does not exist" containerID="68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.494138 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832"} err="failed to get container status \"68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832\": rpc error: code = NotFound desc = could not find container \"68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832\": container with ID starting with 68bb6448b783f72e79c766f1c6815cc9187cf699c75e306d355754e61b30e832 not found: ID does not exist" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.494233 4690 scope.go:117] "RemoveContainer" containerID="2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288" Mar 20 18:22:28 crc kubenswrapper[4690]: E0320 18:22:28.494911 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288\": container with ID starting with 2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288 not found: ID does not exist" containerID="2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.494993 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288"} err="failed to get container status \"2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288\": rpc error: code = NotFound desc = could not find container \"2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288\": container with ID starting with 2a24de39229bea65a0020cacadfff58fa674db01dafea91ec6c6fd8d7bfce288 not found: ID does not exist" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.495042 4690 scope.go:117] "RemoveContainer" 
containerID="a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae" Mar 20 18:22:28 crc kubenswrapper[4690]: E0320 18:22:28.495529 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae\": container with ID starting with a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae not found: ID does not exist" containerID="a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae" Mar 20 18:22:28 crc kubenswrapper[4690]: I0320 18:22:28.495588 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae"} err="failed to get container status \"a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae\": rpc error: code = NotFound desc = could not find container \"a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae\": container with ID starting with a505b1ddcd41f169fabbef0cc3886ed405fdd964daf520955df43f12a19efdae not found: ID does not exist" Mar 20 18:22:29 crc kubenswrapper[4690]: I0320 18:22:29.895199 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" path="/var/lib/kubelet/pods/f3cb84cb-306f-423f-9fd5-81b2b24e8fd2/volumes" Mar 20 18:22:32 crc kubenswrapper[4690]: I0320 18:22:32.884469 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:22:32 crc kubenswrapper[4690]: E0320 18:22:32.885154 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:22:45 crc kubenswrapper[4690]: I0320 18:22:45.901129 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:22:45 crc kubenswrapper[4690]: E0320 18:22:45.902965 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:23:00 crc kubenswrapper[4690]: I0320 18:23:00.883356 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:23:00 crc kubenswrapper[4690]: E0320 18:23:00.883979 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:23:13 crc kubenswrapper[4690]: I0320 18:23:13.883310 4690 scope.go:117] "RemoveContainer" 
containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:23:13 crc kubenswrapper[4690]: E0320 18:23:13.884159 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:23:28 crc kubenswrapper[4690]: I0320 18:23:28.883672 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:23:28 crc kubenswrapper[4690]: E0320 18:23:28.884428 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:23:40 crc kubenswrapper[4690]: I0320 18:23:40.884369 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:23:40 crc kubenswrapper[4690]: E0320 18:23:40.885405 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:23:54 crc kubenswrapper[4690]: I0320 18:23:54.883411 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:23:54 crc kubenswrapper[4690]: E0320 18:23:54.884296 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.152084 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567184-bt79j"] Mar 20 18:24:00 crc kubenswrapper[4690]: E0320 18:24:00.153116 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerName="extract-utilities" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.153131 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerName="extract-utilities" Mar 20 18:24:00 crc kubenswrapper[4690]: E0320 18:24:00.153182 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerName="extract-content" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.153190 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" 
containerName="extract-content" Mar 20 18:24:00 crc kubenswrapper[4690]: E0320 18:24:00.153210 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerName="registry-server" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.153217 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerName="registry-server" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.153509 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cb84cb-306f-423f-9fd5-81b2b24e8fd2" containerName="registry-server" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.154328 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-bt79j" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.156929 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.157126 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.157168 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.168309 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-bt79j"] Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.313511 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dssq5\" (UniqueName: \"kubernetes.io/projected/c9c3ac1c-eae1-4ff1-b67a-6892f1032d50-kube-api-access-dssq5\") pod \"auto-csr-approver-29567184-bt79j\" (UID: \"c9c3ac1c-eae1-4ff1-b67a-6892f1032d50\") " pod="openshift-infra/auto-csr-approver-29567184-bt79j" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.415074 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dssq5\" (UniqueName: \"kubernetes.io/projected/c9c3ac1c-eae1-4ff1-b67a-6892f1032d50-kube-api-access-dssq5\") pod \"auto-csr-approver-29567184-bt79j\" (UID: \"c9c3ac1c-eae1-4ff1-b67a-6892f1032d50\") " pod="openshift-infra/auto-csr-approver-29567184-bt79j" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.435142 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dssq5\" (UniqueName: \"kubernetes.io/projected/c9c3ac1c-eae1-4ff1-b67a-6892f1032d50-kube-api-access-dssq5\") pod \"auto-csr-approver-29567184-bt79j\" (UID: \"c9c3ac1c-eae1-4ff1-b67a-6892f1032d50\") " pod="openshift-infra/auto-csr-approver-29567184-bt79j" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.480776 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-bt79j" Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.959424 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-bt79j"] Mar 20 18:24:00 crc kubenswrapper[4690]: I0320 18:24:00.960371 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:24:01 crc kubenswrapper[4690]: I0320 18:24:01.320540 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-bt79j" event={"ID":"c9c3ac1c-eae1-4ff1-b67a-6892f1032d50","Type":"ContainerStarted","Data":"1a56897c5a25c3677c42ad0fb762d066e5d024dee69bf1ebe525f5e1eaff05ed"} Mar 20 18:24:02 crc kubenswrapper[4690]: I0320 18:24:02.329909 4690 generic.go:334] "Generic (PLEG): container finished" podID="c9c3ac1c-eae1-4ff1-b67a-6892f1032d50" containerID="2515792b8d0096fa242a7862922dea334f0bd2244a1fce9d313f283d9ac8d1c5" exitCode=0 Mar 20 18:24:02 crc kubenswrapper[4690]: I0320 18:24:02.330019 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-bt79j" event={"ID":"c9c3ac1c-eae1-4ff1-b67a-6892f1032d50","Type":"ContainerDied","Data":"2515792b8d0096fa242a7862922dea334f0bd2244a1fce9d313f283d9ac8d1c5"} Mar 20 18:24:03 crc kubenswrapper[4690]: I0320 18:24:03.788813 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-bt79j" Mar 20 18:24:03 crc kubenswrapper[4690]: I0320 18:24:03.991493 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dssq5\" (UniqueName: \"kubernetes.io/projected/c9c3ac1c-eae1-4ff1-b67a-6892f1032d50-kube-api-access-dssq5\") pod \"c9c3ac1c-eae1-4ff1-b67a-6892f1032d50\" (UID: \"c9c3ac1c-eae1-4ff1-b67a-6892f1032d50\") " Mar 20 18:24:04 crc kubenswrapper[4690]: I0320 18:24:04.001749 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c3ac1c-eae1-4ff1-b67a-6892f1032d50-kube-api-access-dssq5" (OuterVolumeSpecName: "kube-api-access-dssq5") pod "c9c3ac1c-eae1-4ff1-b67a-6892f1032d50" (UID: "c9c3ac1c-eae1-4ff1-b67a-6892f1032d50"). InnerVolumeSpecName "kube-api-access-dssq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:24:04 crc kubenswrapper[4690]: I0320 18:24:04.094231 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dssq5\" (UniqueName: \"kubernetes.io/projected/c9c3ac1c-eae1-4ff1-b67a-6892f1032d50-kube-api-access-dssq5\") on node \"crc\" DevicePath \"\"" Mar 20 18:24:04 crc kubenswrapper[4690]: I0320 18:24:04.352970 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-bt79j" event={"ID":"c9c3ac1c-eae1-4ff1-b67a-6892f1032d50","Type":"ContainerDied","Data":"1a56897c5a25c3677c42ad0fb762d066e5d024dee69bf1ebe525f5e1eaff05ed"} Mar 20 18:24:04 crc kubenswrapper[4690]: I0320 18:24:04.353016 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a56897c5a25c3677c42ad0fb762d066e5d024dee69bf1ebe525f5e1eaff05ed" Mar 20 18:24:04 crc kubenswrapper[4690]: I0320 18:24:04.353052 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-bt79j" Mar 20 18:24:04 crc kubenswrapper[4690]: I0320 18:24:04.885700 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-96l5z"] Mar 20 18:24:04 crc kubenswrapper[4690]: I0320 18:24:04.895851 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-96l5z"] Mar 20 18:24:05 crc kubenswrapper[4690]: I0320 18:24:05.903017 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c78c20-5d06-4ae9-b2d8-1038463e235d" path="/var/lib/kubelet/pods/b4c78c20-5d06-4ae9-b2d8-1038463e235d/volumes" Mar 20 18:24:08 crc kubenswrapper[4690]: I0320 18:24:08.884013 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:24:08 crc kubenswrapper[4690]: E0320 18:24:08.885139 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:24:19 crc kubenswrapper[4690]: I0320 18:24:19.883481 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:24:19 crc kubenswrapper[4690]: E0320 18:24:19.884290 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:24:24 crc kubenswrapper[4690]: I0320 18:24:24.246285 4690 scope.go:117] "RemoveContainer" containerID="3d170acfe4e91036374e01600024966964dd16c92b739363e9923deaa98dc1c8" Mar 20 18:24:32 crc kubenswrapper[4690]: I0320 18:24:32.883786 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:24:32 crc kubenswrapper[4690]: E0320 18:24:32.884832 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:24:47 crc kubenswrapper[4690]: I0320 18:24:47.883392 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:24:47 crc kubenswrapper[4690]: E0320 18:24:47.884173 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 
18:25:00 crc kubenswrapper[4690]: I0320 18:25:00.884194 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:25:00 crc kubenswrapper[4690]: E0320 18:25:00.884961 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:25:11 crc kubenswrapper[4690]: I0320 18:25:11.883884 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:25:11 crc kubenswrapper[4690]: E0320 18:25:11.884709 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:25:25 crc kubenswrapper[4690]: I0320 18:25:25.889576 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:25:25 crc kubenswrapper[4690]: E0320 18:25:25.890402 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:25:40 crc kubenswrapper[4690]: I0320 18:25:40.883458 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:25:40 crc kubenswrapper[4690]: E0320 18:25:40.884076 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:25:53 crc kubenswrapper[4690]: I0320 18:25:53.883680 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:25:53 crc kubenswrapper[4690]: E0320 18:25:53.884744 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.158045 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567186-ggxmj"] Mar 20 18:26:00 crc 
kubenswrapper[4690]: E0320 18:26:00.159056 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c3ac1c-eae1-4ff1-b67a-6892f1032d50" containerName="oc" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.159072 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c3ac1c-eae1-4ff1-b67a-6892f1032d50" containerName="oc" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.159562 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c3ac1c-eae1-4ff1-b67a-6892f1032d50" containerName="oc" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.160328 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-ggxmj" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.165413 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.165845 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.169942 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.170170 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-ggxmj"] Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.288573 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgh7r\" (UniqueName: \"kubernetes.io/projected/e1632020-99f1-449a-b150-387af3337331-kube-api-access-pgh7r\") pod \"auto-csr-approver-29567186-ggxmj\" (UID: \"e1632020-99f1-449a-b150-387af3337331\") " pod="openshift-infra/auto-csr-approver-29567186-ggxmj" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.390471 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgh7r\" (UniqueName: \"kubernetes.io/projected/e1632020-99f1-449a-b150-387af3337331-kube-api-access-pgh7r\") pod \"auto-csr-approver-29567186-ggxmj\" (UID: \"e1632020-99f1-449a-b150-387af3337331\") " pod="openshift-infra/auto-csr-approver-29567186-ggxmj" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.414285 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgh7r\" (UniqueName: \"kubernetes.io/projected/e1632020-99f1-449a-b150-387af3337331-kube-api-access-pgh7r\") pod \"auto-csr-approver-29567186-ggxmj\" (UID: \"e1632020-99f1-449a-b150-387af3337331\") " pod="openshift-infra/auto-csr-approver-29567186-ggxmj" Mar 20 18:26:00 crc kubenswrapper[4690]: I0320 18:26:00.479148 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-ggxmj" Mar 20 18:26:01 crc kubenswrapper[4690]: I0320 18:26:00.999919 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-ggxmj"] Mar 20 18:26:01 crc kubenswrapper[4690]: W0320 18:26:01.000111 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1632020_99f1_449a_b150_387af3337331.slice/crio-e6668fd3168312eb115f75c1f88f366704dcf5d364bd387a39be31e8796446bc WatchSource:0}: Error finding container e6668fd3168312eb115f75c1f88f366704dcf5d364bd387a39be31e8796446bc: Status 404 returned error can't find the container with id e6668fd3168312eb115f75c1f88f366704dcf5d364bd387a39be31e8796446bc Mar 20 18:26:01 crc kubenswrapper[4690]: I0320 18:26:01.607716 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-ggxmj" event={"ID":"e1632020-99f1-449a-b150-387af3337331","Type":"ContainerStarted","Data":"e6668fd3168312eb115f75c1f88f366704dcf5d364bd387a39be31e8796446bc"} Mar 20 18:26:02 crc kubenswrapper[4690]: I0320 18:26:02.617736 4690 generic.go:334] "Generic (PLEG): container finished" podID="e1632020-99f1-449a-b150-387af3337331" containerID="17c3a62df930ee9165517410d1a4c83e575c60e6d4bf84e92750a00dea6928df" exitCode=0 Mar 20 18:26:02 crc kubenswrapper[4690]: I0320 18:26:02.617827 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-ggxmj" event={"ID":"e1632020-99f1-449a-b150-387af3337331","Type":"ContainerDied","Data":"17c3a62df930ee9165517410d1a4c83e575c60e6d4bf84e92750a00dea6928df"} Mar 20 18:26:04 crc kubenswrapper[4690]: I0320 18:26:04.072656 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-ggxmj" Mar 20 18:26:04 crc kubenswrapper[4690]: I0320 18:26:04.176077 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgh7r\" (UniqueName: \"kubernetes.io/projected/e1632020-99f1-449a-b150-387af3337331-kube-api-access-pgh7r\") pod \"e1632020-99f1-449a-b150-387af3337331\" (UID: \"e1632020-99f1-449a-b150-387af3337331\") " Mar 20 18:26:04 crc kubenswrapper[4690]: I0320 18:26:04.185491 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1632020-99f1-449a-b150-387af3337331-kube-api-access-pgh7r" (OuterVolumeSpecName: "kube-api-access-pgh7r") pod "e1632020-99f1-449a-b150-387af3337331" (UID: "e1632020-99f1-449a-b150-387af3337331"). InnerVolumeSpecName "kube-api-access-pgh7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:26:04 crc kubenswrapper[4690]: I0320 18:26:04.277969 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgh7r\" (UniqueName: \"kubernetes.io/projected/e1632020-99f1-449a-b150-387af3337331-kube-api-access-pgh7r\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:04 crc kubenswrapper[4690]: I0320 18:26:04.643110 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-ggxmj" event={"ID":"e1632020-99f1-449a-b150-387af3337331","Type":"ContainerDied","Data":"e6668fd3168312eb115f75c1f88f366704dcf5d364bd387a39be31e8796446bc"} Mar 20 18:26:04 crc kubenswrapper[4690]: I0320 18:26:04.643614 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6668fd3168312eb115f75c1f88f366704dcf5d364bd387a39be31e8796446bc" Mar 20 18:26:04 crc kubenswrapper[4690]: I0320 18:26:04.643180 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-ggxmj" Mar 20 18:26:05 crc kubenswrapper[4690]: I0320 18:26:05.191048 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-dgt87"] Mar 20 18:26:05 crc kubenswrapper[4690]: I0320 18:26:05.200908 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-dgt87"] Mar 20 18:26:05 crc kubenswrapper[4690]: I0320 18:26:05.900933 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f1aafa-16b2-4d36-84da-fcbb45e17ba4" path="/var/lib/kubelet/pods/b4f1aafa-16b2-4d36-84da-fcbb45e17ba4/volumes" Mar 20 18:26:06 crc kubenswrapper[4690]: I0320 18:26:06.884686 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:26:06 crc kubenswrapper[4690]: E0320 18:26:06.885148 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:26:17 crc kubenswrapper[4690]: I0320 18:26:17.884104 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:26:17 crc kubenswrapper[4690]: E0320 18:26:17.885388 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:26:24 crc kubenswrapper[4690]: I0320 18:26:24.377550 4690 scope.go:117] "RemoveContainer" containerID="a09d091d4c1eabb850cbf47d8747b2dafa04a6f298358bf993b03224eb8171bc" Mar 20 18:26:29 crc kubenswrapper[4690]: I0320 18:26:29.884153 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:26:29 crc kubenswrapper[4690]: E0320 18:26:29.885458 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:26:43 crc kubenswrapper[4690]: I0320 18:26:43.883653 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:26:43 crc kubenswrapper[4690]: E0320 18:26:43.884422 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:26:56 crc kubenswrapper[4690]: I0320 18:26:56.883567 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:26:57 crc kubenswrapper[4690]: I0320 18:26:57.389284 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"a2e6f68efca4135e3c8fa49777a5346857b4523349d6d6127d731aa0476809cd"} Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.025069 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hs8bh"] Mar 20 18:28:00 crc kubenswrapper[4690]: E0320 18:28:00.026306 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1632020-99f1-449a-b150-387af3337331" containerName="oc" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.026322 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1632020-99f1-449a-b150-387af3337331" containerName="oc" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.026530 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1632020-99f1-449a-b150-387af3337331" containerName="oc" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.027834 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.046179 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hs8bh"] Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.127559 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-catalog-content\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.127659 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlsfs\" (UniqueName: \"kubernetes.io/projected/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-kube-api-access-dlsfs\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.127769 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-utilities\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.146936 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567188-bh6s9"] Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.151345 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-bh6s9" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.163191 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.164914 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.165079 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.198977 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-bh6s9"] Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.229386 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-utilities\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.229577 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-catalog-content\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.229625 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlsfs\" (UniqueName: \"kubernetes.io/projected/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-kube-api-access-dlsfs\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.230429 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-utilities\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.231667 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-catalog-content\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.240950 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cv9j6"] Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.242820 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.248548 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cv9j6"] Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.274158 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlsfs\" (UniqueName: \"kubernetes.io/projected/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-kube-api-access-dlsfs\") pod \"certified-operators-hs8bh\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.330738 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5fqz\" (UniqueName: \"kubernetes.io/projected/a15191f6-a6b7-458d-be7d-3387d83561d7-kube-api-access-j5fqz\") pod \"auto-csr-approver-29567188-bh6s9\" (UID: \"a15191f6-a6b7-458d-be7d-3387d83561d7\") " pod="openshift-infra/auto-csr-approver-29567188-bh6s9" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.349813 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.432640 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-catalog-content\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.432760 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-utilities\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.432807 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgt45\" (UniqueName: \"kubernetes.io/projected/6359f153-23f5-49f4-a900-c64d4110f8b5-kube-api-access-qgt45\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.432876 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5fqz\" (UniqueName: \"kubernetes.io/projected/a15191f6-a6b7-458d-be7d-3387d83561d7-kube-api-access-j5fqz\") pod \"auto-csr-approver-29567188-bh6s9\" (UID: \"a15191f6-a6b7-458d-be7d-3387d83561d7\") " pod="openshift-infra/auto-csr-approver-29567188-bh6s9" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.453663 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5fqz\" (UniqueName: \"kubernetes.io/projected/a15191f6-a6b7-458d-be7d-3387d83561d7-kube-api-access-j5fqz\") pod \"auto-csr-approver-29567188-bh6s9\" (UID: \"a15191f6-a6b7-458d-be7d-3387d83561d7\") " pod="openshift-infra/auto-csr-approver-29567188-bh6s9" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.489407 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-bh6s9" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.536905 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-catalog-content\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.536996 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-utilities\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.537036 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgt45\" (UniqueName: \"kubernetes.io/projected/6359f153-23f5-49f4-a900-c64d4110f8b5-kube-api-access-qgt45\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.537723 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-catalog-content\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.537821 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-utilities\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.559281 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgt45\" (UniqueName: \"kubernetes.io/projected/6359f153-23f5-49f4-a900-c64d4110f8b5-kube-api-access-qgt45\") pod \"community-operators-cv9j6\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.625110 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:00 crc kubenswrapper[4690]: I0320 18:28:00.960442 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hs8bh"] Mar 20 18:28:01 crc kubenswrapper[4690]: I0320 18:28:01.084012 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-bh6s9"] Mar 20 18:28:01 crc kubenswrapper[4690]: I0320 18:28:01.093290 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs8bh" event={"ID":"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9","Type":"ContainerStarted","Data":"f2166aa0f7385d3ff8d7185ac9c94023db9226a99b65fe42c55f31c9160bbad0"} Mar 20 18:28:01 crc kubenswrapper[4690]: W0320 18:28:01.378991 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6359f153_23f5_49f4_a900_c64d4110f8b5.slice/crio-89b353a644d0da184e1bb7f05300bfd8bb75a074e6c80e57dc3177ba5d9013e4 WatchSource:0}: Error finding container 89b353a644d0da184e1bb7f05300bfd8bb75a074e6c80e57dc3177ba5d9013e4: Status 404 returned error can't find the container with id 89b353a644d0da184e1bb7f05300bfd8bb75a074e6c80e57dc3177ba5d9013e4 Mar 20 18:28:01 crc kubenswrapper[4690]: I0320 18:28:01.380767 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cv9j6"] Mar 20 18:28:02 crc kubenswrapper[4690]: I0320 18:28:02.103195 4690 generic.go:334] "Generic (PLEG): container finished" podID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerID="211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912" exitCode=0 Mar 20 18:28:02 crc kubenswrapper[4690]: I0320 18:28:02.103541 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs8bh" event={"ID":"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9","Type":"ContainerDied","Data":"211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912"} Mar 20 18:28:02 crc kubenswrapper[4690]: I0320 18:28:02.111396 4690 generic.go:334] "Generic (PLEG): container finished" podID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerID="fee206ba1279f69237a2ecd754f5b7c33b0f64589a1e192b638499e0a31308ec" exitCode=0 Mar 20 18:28:02 crc kubenswrapper[4690]: I0320 18:28:02.111524 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv9j6" event={"ID":"6359f153-23f5-49f4-a900-c64d4110f8b5","Type":"ContainerDied","Data":"fee206ba1279f69237a2ecd754f5b7c33b0f64589a1e192b638499e0a31308ec"} Mar 20 18:28:02 crc kubenswrapper[4690]: I0320 18:28:02.111584 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv9j6" event={"ID":"6359f153-23f5-49f4-a900-c64d4110f8b5","Type":"ContainerStarted","Data":"89b353a644d0da184e1bb7f05300bfd8bb75a074e6c80e57dc3177ba5d9013e4"} Mar 20 18:28:02 crc kubenswrapper[4690]: I0320 18:28:02.116692 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-bh6s9" event={"ID":"a15191f6-a6b7-458d-be7d-3387d83561d7","Type":"ContainerStarted","Data":"943f6a623eaed9c41e7fefdae3da9f84f091218ec83750229ee1f311ff5fe29a"} Mar 20 18:28:03 crc kubenswrapper[4690]: I0320 18:28:03.137183 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv9j6" 
event={"ID":"6359f153-23f5-49f4-a900-c64d4110f8b5","Type":"ContainerStarted","Data":"38624417d3ec98ce02708d7a0fbe26f09299b1cc24e1e877b6b5c56853eb4e71"} Mar 20 18:28:03 crc kubenswrapper[4690]: I0320 18:28:03.139998 4690 generic.go:334] "Generic (PLEG): container finished" podID="a15191f6-a6b7-458d-be7d-3387d83561d7" containerID="880b1625ae6ddbfbb89745ebf7924388f76f23475b2891410aa70c1e7c410b36" exitCode=0 Mar 20 18:28:03 crc kubenswrapper[4690]: I0320 18:28:03.140052 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-bh6s9" event={"ID":"a15191f6-a6b7-458d-be7d-3387d83561d7","Type":"ContainerDied","Data":"880b1625ae6ddbfbb89745ebf7924388f76f23475b2891410aa70c1e7c410b36"} Mar 20 18:28:04 crc kubenswrapper[4690]: I0320 18:28:04.152425 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs8bh" event={"ID":"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9","Type":"ContainerStarted","Data":"cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60"} Mar 20 18:28:04 crc kubenswrapper[4690]: I0320 18:28:04.157469 4690 generic.go:334] "Generic (PLEG): container finished" podID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerID="38624417d3ec98ce02708d7a0fbe26f09299b1cc24e1e877b6b5c56853eb4e71" exitCode=0 Mar 20 18:28:04 crc kubenswrapper[4690]: I0320 18:28:04.157568 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv9j6" event={"ID":"6359f153-23f5-49f4-a900-c64d4110f8b5","Type":"ContainerDied","Data":"38624417d3ec98ce02708d7a0fbe26f09299b1cc24e1e877b6b5c56853eb4e71"} Mar 20 18:28:04 crc kubenswrapper[4690]: I0320 18:28:04.547728 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-bh6s9" Mar 20 18:28:04 crc kubenswrapper[4690]: I0320 18:28:04.725574 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5fqz\" (UniqueName: \"kubernetes.io/projected/a15191f6-a6b7-458d-be7d-3387d83561d7-kube-api-access-j5fqz\") pod \"a15191f6-a6b7-458d-be7d-3387d83561d7\" (UID: \"a15191f6-a6b7-458d-be7d-3387d83561d7\") " Mar 20 18:28:04 crc kubenswrapper[4690]: I0320 18:28:04.730900 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15191f6-a6b7-458d-be7d-3387d83561d7-kube-api-access-j5fqz" (OuterVolumeSpecName: "kube-api-access-j5fqz") pod "a15191f6-a6b7-458d-be7d-3387d83561d7" (UID: "a15191f6-a6b7-458d-be7d-3387d83561d7"). InnerVolumeSpecName "kube-api-access-j5fqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:04 crc kubenswrapper[4690]: I0320 18:28:04.827555 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5fqz\" (UniqueName: \"kubernetes.io/projected/a15191f6-a6b7-458d-be7d-3387d83561d7-kube-api-access-j5fqz\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:05 crc kubenswrapper[4690]: I0320 18:28:05.169028 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-bh6s9" Mar 20 18:28:05 crc kubenswrapper[4690]: I0320 18:28:05.169019 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-bh6s9" event={"ID":"a15191f6-a6b7-458d-be7d-3387d83561d7","Type":"ContainerDied","Data":"943f6a623eaed9c41e7fefdae3da9f84f091218ec83750229ee1f311ff5fe29a"} Mar 20 18:28:05 crc kubenswrapper[4690]: I0320 18:28:05.169082 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="943f6a623eaed9c41e7fefdae3da9f84f091218ec83750229ee1f311ff5fe29a" Mar 20 18:28:05 crc kubenswrapper[4690]: I0320 18:28:05.623704 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-ccxc8"] Mar 20 18:28:05 crc kubenswrapper[4690]: I0320 18:28:05.631921 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-ccxc8"] Mar 20 18:28:05 crc kubenswrapper[4690]: I0320 18:28:05.895973 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc295a8d-40fc-4bc8-bcd3-969af4663933" path="/var/lib/kubelet/pods/cc295a8d-40fc-4bc8-bcd3-969af4663933/volumes" Mar 20 18:28:06 crc kubenswrapper[4690]: I0320 18:28:06.181874 4690 generic.go:334] "Generic (PLEG): container finished" podID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerID="cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60" exitCode=0 Mar 20 18:28:06 crc kubenswrapper[4690]: I0320 18:28:06.182035 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs8bh" event={"ID":"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9","Type":"ContainerDied","Data":"cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60"} Mar 20 18:28:06 crc kubenswrapper[4690]: I0320 18:28:06.184901 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv9j6" event={"ID":"6359f153-23f5-49f4-a900-c64d4110f8b5","Type":"ContainerStarted","Data":"f857a94b4a15e2656e9e935bdf7a20fb2d5f9b2a55d7d035683b782cdca265cd"} Mar 20 18:28:06 crc kubenswrapper[4690]: I0320 18:28:06.232412 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cv9j6" podStartSLOduration=3.408002323 podStartE2EDuration="6.232391146s" podCreationTimestamp="2026-03-20 18:28:00 +0000 UTC" firstStartedPulling="2026-03-20 18:28:02.114827327 +0000 UTC m=+3356.980653005" lastFinishedPulling="2026-03-20 18:28:04.93921614 +0000 UTC m=+3359.805041828" observedRunningTime="2026-03-20 18:28:06.222782784 +0000 UTC m=+3361.088608512" watchObservedRunningTime="2026-03-20 18:28:06.232391146 +0000 UTC m=+3361.098216834" Mar 20 18:28:07 crc kubenswrapper[4690]: I0320 18:28:07.197520 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs8bh" event={"ID":"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9","Type":"ContainerStarted","Data":"68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253"} Mar 20 18:28:07 crc kubenswrapper[4690]: I0320 18:28:07.229985 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hs8bh" podStartSLOduration=2.6013738010000003 podStartE2EDuration="7.229961806s" podCreationTimestamp="2026-03-20 18:28:00 +0000 UTC" firstStartedPulling="2026-03-20 18:28:02.105529114 +0000 UTC m=+3356.971354812" lastFinishedPulling="2026-03-20 18:28:06.734117139 +0000 UTC m=+3361.599942817" 
observedRunningTime="2026-03-20 18:28:07.223121613 +0000 UTC m=+3362.088947311" watchObservedRunningTime="2026-03-20 18:28:07.229961806 +0000 UTC m=+3362.095787484" Mar 20 18:28:10 crc kubenswrapper[4690]: I0320 18:28:10.350806 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:10 crc kubenswrapper[4690]: I0320 18:28:10.352632 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:10 crc kubenswrapper[4690]: I0320 18:28:10.626162 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:10 crc kubenswrapper[4690]: I0320 18:28:10.626897 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:11 crc kubenswrapper[4690]: I0320 18:28:11.394963 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hs8bh" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="registry-server" probeResult="failure" output=< Mar 20 18:28:11 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 18:28:11 crc kubenswrapper[4690]: > Mar 20 18:28:11 crc kubenswrapper[4690]: I0320 18:28:11.685663 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-cv9j6" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="registry-server" probeResult="failure" output=< Mar 20 18:28:11 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 18:28:11 crc kubenswrapper[4690]: > Mar 20 18:28:20 crc kubenswrapper[4690]: I0320 18:28:20.398544 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:20 crc kubenswrapper[4690]: I0320 18:28:20.444253 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:20 crc kubenswrapper[4690]: I0320 18:28:20.637655 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hs8bh"] Mar 20 18:28:20 crc kubenswrapper[4690]: I0320 18:28:20.677089 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:20 crc kubenswrapper[4690]: I0320 18:28:20.729655 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.327919 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hs8bh" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="registry-server" containerID="cri-o://68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253" gracePeriod=2 Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.481172 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-799f9bd8b7-4q7w9" podUID="3f074183-2793-4719-95b3-c2df447c93ab" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.891543 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.983114 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-catalog-content\") pod \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.983505 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-utilities\") pod \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.983756 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlsfs\" (UniqueName: \"kubernetes.io/projected/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-kube-api-access-dlsfs\") pod \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\" (UID: \"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9\") " Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.984053 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-utilities" (OuterVolumeSpecName: "utilities") pod "3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" (UID: "3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.984411 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:22 crc kubenswrapper[4690]: I0320 18:28:22.988818 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-kube-api-access-dlsfs" (OuterVolumeSpecName: "kube-api-access-dlsfs") pod "3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" (UID: "3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9"). InnerVolumeSpecName "kube-api-access-dlsfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.034803 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" (UID: "3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.037165 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cv9j6"] Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.037490 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cv9j6" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="registry-server" containerID="cri-o://f857a94b4a15e2656e9e935bdf7a20fb2d5f9b2a55d7d035683b782cdca265cd" gracePeriod=2 Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.085944 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlsfs\" (UniqueName: \"kubernetes.io/projected/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-kube-api-access-dlsfs\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.085979 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.340330 4690 generic.go:334] "Generic (PLEG): container finished" podID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerID="68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253" exitCode=0 Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.340412 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hs8bh" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.340433 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs8bh" event={"ID":"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9","Type":"ContainerDied","Data":"68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253"} Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.340731 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs8bh" event={"ID":"3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9","Type":"ContainerDied","Data":"f2166aa0f7385d3ff8d7185ac9c94023db9226a99b65fe42c55f31c9160bbad0"} Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.340749 4690 scope.go:117] "RemoveContainer" containerID="68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.344145 4690 generic.go:334] "Generic (PLEG): container finished" podID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerID="f857a94b4a15e2656e9e935bdf7a20fb2d5f9b2a55d7d035683b782cdca265cd" exitCode=0 Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.344164 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv9j6" event={"ID":"6359f153-23f5-49f4-a900-c64d4110f8b5","Type":"ContainerDied","Data":"f857a94b4a15e2656e9e935bdf7a20fb2d5f9b2a55d7d035683b782cdca265cd"} Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.361479 4690 scope.go:117] "RemoveContainer" containerID="cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.388375 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hs8bh"] Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.389585 4690 scope.go:117] "RemoveContainer" containerID="211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912" Mar 20 18:28:23 
crc kubenswrapper[4690]: I0320 18:28:23.397658 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hs8bh"] Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.408762 4690 scope.go:117] "RemoveContainer" containerID="68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253" Mar 20 18:28:23 crc kubenswrapper[4690]: E0320 18:28:23.409265 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253\": container with ID starting with 68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253 not found: ID does not exist" containerID="68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.409296 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253"} err="failed to get container status \"68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253\": rpc error: code = NotFound desc = could not find container \"68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253\": container with ID starting with 68360088bbb085f966b3cd9c0fac30c56c7cc2fea59a2122f9eb7a01417fc253 not found: ID does not exist" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.409319 4690 scope.go:117] "RemoveContainer" containerID="cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60" Mar 20 18:28:23 crc kubenswrapper[4690]: E0320 18:28:23.410575 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60\": container with ID starting with cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60 not found: ID does not exist" containerID="cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.410598 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60"} err="failed to get container status \"cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60\": rpc error: code = NotFound desc = could not find container \"cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60\": container with ID starting with cf33ae3d97335bc5f9b18bbef9326a66ec063613caeaaa5e91e36e089513dd60 not found: ID does not exist" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.410611 4690 scope.go:117] "RemoveContainer" containerID="211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912" Mar 20 18:28:23 crc kubenswrapper[4690]: E0320 18:28:23.410871 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912\": container with ID starting with 211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912 not found: ID does not exist" containerID="211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.410893 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912"} err="failed to get container status 
\"211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912\": rpc error: code = NotFound desc = could not find container \"211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912\": container with ID starting with 211de1fc8f236174fed47862f76a9f2e5aa0d60f3886303eac259f69e8f36912 not found: ID does not exist" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.531006 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.594108 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-utilities\") pod \"6359f153-23f5-49f4-a900-c64d4110f8b5\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.594333 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-catalog-content\") pod \"6359f153-23f5-49f4-a900-c64d4110f8b5\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.594374 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgt45\" (UniqueName: \"kubernetes.io/projected/6359f153-23f5-49f4-a900-c64d4110f8b5-kube-api-access-qgt45\") pod \"6359f153-23f5-49f4-a900-c64d4110f8b5\" (UID: \"6359f153-23f5-49f4-a900-c64d4110f8b5\") " Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.595513 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-utilities" (OuterVolumeSpecName: "utilities") pod "6359f153-23f5-49f4-a900-c64d4110f8b5" (UID: "6359f153-23f5-49f4-a900-c64d4110f8b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.604857 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6359f153-23f5-49f4-a900-c64d4110f8b5-kube-api-access-qgt45" (OuterVolumeSpecName: "kube-api-access-qgt45") pod "6359f153-23f5-49f4-a900-c64d4110f8b5" (UID: "6359f153-23f5-49f4-a900-c64d4110f8b5"). InnerVolumeSpecName "kube-api-access-qgt45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.644975 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6359f153-23f5-49f4-a900-c64d4110f8b5" (UID: "6359f153-23f5-49f4-a900-c64d4110f8b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.696842 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.696885 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6359f153-23f5-49f4-a900-c64d4110f8b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.696907 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgt45\" (UniqueName: \"kubernetes.io/projected/6359f153-23f5-49f4-a900-c64d4110f8b5-kube-api-access-qgt45\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:23 crc kubenswrapper[4690]: I0320 18:28:23.900510 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" path="/var/lib/kubelet/pods/3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9/volumes" Mar 20 18:28:24 crc kubenswrapper[4690]: I0320 18:28:24.360246 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv9j6" event={"ID":"6359f153-23f5-49f4-a900-c64d4110f8b5","Type":"ContainerDied","Data":"89b353a644d0da184e1bb7f05300bfd8bb75a074e6c80e57dc3177ba5d9013e4"} Mar 20 18:28:24 crc kubenswrapper[4690]: I0320 18:28:24.360395 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv9j6" Mar 20 18:28:24 crc kubenswrapper[4690]: I0320 18:28:24.360650 4690 scope.go:117] "RemoveContainer" containerID="f857a94b4a15e2656e9e935bdf7a20fb2d5f9b2a55d7d035683b782cdca265cd" Mar 20 18:28:24 crc kubenswrapper[4690]: I0320 18:28:24.404161 4690 scope.go:117] "RemoveContainer" containerID="38624417d3ec98ce02708d7a0fbe26f09299b1cc24e1e877b6b5c56853eb4e71" Mar 20 18:28:24 crc kubenswrapper[4690]: I0320 18:28:24.408456 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cv9j6"] Mar 20 18:28:24 crc kubenswrapper[4690]: I0320 18:28:24.441766 4690 scope.go:117] "RemoveContainer" containerID="fee206ba1279f69237a2ecd754f5b7c33b0f64589a1e192b638499e0a31308ec" Mar 20 18:28:24 crc kubenswrapper[4690]: I0320 18:28:24.454270 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cv9j6"] Mar 20 18:28:24 crc kubenswrapper[4690]: I0320 18:28:24.549061 4690 scope.go:117] "RemoveContainer" containerID="622489cbfdda45a236856cf4f63dcbc208329ecf8b558b11bf3685ce116fdd61" Mar 20 18:28:25 crc kubenswrapper[4690]: I0320 18:28:25.897209 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" path="/var/lib/kubelet/pods/6359f153-23f5-49f4-a900-c64d4110f8b5/volumes" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.712659 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-82cvq"] Mar 20 18:28:34 crc kubenswrapper[4690]: E0320 18:28:34.713494 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="extract-utilities" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713513 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="extract-utilities" Mar 20 18:28:34 crc 
kubenswrapper[4690]: E0320 18:28:34.713533 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="registry-server" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713543 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="registry-server" Mar 20 18:28:34 crc kubenswrapper[4690]: E0320 18:28:34.713563 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="extract-content" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713572 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="extract-content" Mar 20 18:28:34 crc kubenswrapper[4690]: E0320 18:28:34.713589 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="registry-server" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713597 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="registry-server" Mar 20 18:28:34 crc kubenswrapper[4690]: E0320 18:28:34.713606 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15191f6-a6b7-458d-be7d-3387d83561d7" containerName="oc" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713615 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15191f6-a6b7-458d-be7d-3387d83561d7" containerName="oc" Mar 20 18:28:34 crc kubenswrapper[4690]: E0320 18:28:34.713629 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="extract-utilities" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713637 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="extract-utilities" Mar 20 18:28:34 crc kubenswrapper[4690]: E0320 18:28:34.713663 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="extract-content" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713671 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="extract-content" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713929 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b6e5eca-65ff-4d86-8f2e-e65c784e5ce9" containerName="registry-server" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713953 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="6359f153-23f5-49f4-a900-c64d4110f8b5" containerName="registry-server" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.713977 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15191f6-a6b7-458d-be7d-3387d83561d7" containerName="oc" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.716101 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.748752 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-82cvq"] Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.832871 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-utilities\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.833049 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2xf7\" (UniqueName: \"kubernetes.io/projected/7d7ebf70-ea07-412e-b6ef-5574f592516f-kube-api-access-c2xf7\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.833277 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-catalog-content\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.937161 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-catalog-content\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.937241 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-utilities\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.937379 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2xf7\" (UniqueName: \"kubernetes.io/projected/7d7ebf70-ea07-412e-b6ef-5574f592516f-kube-api-access-c2xf7\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.938185 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-utilities\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.938229 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-catalog-content\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:34 crc kubenswrapper[4690]: I0320 18:28:34.958029 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c2xf7\" (UniqueName: \"kubernetes.io/projected/7d7ebf70-ea07-412e-b6ef-5574f592516f-kube-api-access-c2xf7\") pod \"redhat-operators-82cvq\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:35 crc kubenswrapper[4690]: I0320 18:28:35.050827 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:35 crc kubenswrapper[4690]: I0320 18:28:35.556848 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-82cvq"] Mar 20 18:28:36 crc kubenswrapper[4690]: I0320 18:28:36.478727 4690 generic.go:334] "Generic (PLEG): container finished" podID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerID="8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4" exitCode=0 Mar 20 18:28:36 crc kubenswrapper[4690]: I0320 18:28:36.478840 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82cvq" event={"ID":"7d7ebf70-ea07-412e-b6ef-5574f592516f","Type":"ContainerDied","Data":"8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4"} Mar 20 18:28:36 crc kubenswrapper[4690]: I0320 18:28:36.479087 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82cvq" event={"ID":"7d7ebf70-ea07-412e-b6ef-5574f592516f","Type":"ContainerStarted","Data":"9cb2b7e4a7ccacde459059484700670fb6821889c84fb26092957c438560a8f3"} Mar 20 18:28:38 crc kubenswrapper[4690]: I0320 18:28:38.498832 4690 generic.go:334] "Generic (PLEG): container finished" podID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerID="20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd" exitCode=0 Mar 20 18:28:38 crc kubenswrapper[4690]: I0320 18:28:38.498888 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82cvq" event={"ID":"7d7ebf70-ea07-412e-b6ef-5574f592516f","Type":"ContainerDied","Data":"20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd"} Mar 20 18:28:44 crc kubenswrapper[4690]: I0320 18:28:44.580627 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82cvq" event={"ID":"7d7ebf70-ea07-412e-b6ef-5574f592516f","Type":"ContainerStarted","Data":"e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc"} Mar 20 18:28:44 crc kubenswrapper[4690]: I0320 18:28:44.604818 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-82cvq" podStartSLOduration=2.802656663 podStartE2EDuration="10.604795102s" podCreationTimestamp="2026-03-20 18:28:34 +0000 UTC" firstStartedPulling="2026-03-20 18:28:36.480693269 +0000 UTC m=+3391.346518937" lastFinishedPulling="2026-03-20 18:28:44.282831698 +0000 UTC m=+3399.148657376" observedRunningTime="2026-03-20 18:28:44.602397654 +0000 UTC m=+3399.468223332" watchObservedRunningTime="2026-03-20 18:28:44.604795102 +0000 UTC m=+3399.470620800" Mar 20 18:28:45 crc kubenswrapper[4690]: I0320 18:28:45.051989 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:45 crc kubenswrapper[4690]: I0320 18:28:45.052977 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:46 crc kubenswrapper[4690]: I0320 18:28:46.101408 4690 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-82cvq" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="registry-server" probeResult="failure" output=< Mar 20 18:28:46 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 18:28:46 crc kubenswrapper[4690]: > Mar 20 18:28:55 crc kubenswrapper[4690]: I0320 18:28:55.108503 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:55 crc kubenswrapper[4690]: I0320 18:28:55.167374 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:55 crc kubenswrapper[4690]: I0320 18:28:55.349808 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-82cvq"] Mar 20 18:28:56 crc kubenswrapper[4690]: I0320 18:28:56.684013 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-82cvq" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="registry-server" containerID="cri-o://e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc" gracePeriod=2 Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.199379 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.252882 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2xf7\" (UniqueName: \"kubernetes.io/projected/7d7ebf70-ea07-412e-b6ef-5574f592516f-kube-api-access-c2xf7\") pod \"7d7ebf70-ea07-412e-b6ef-5574f592516f\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.253017 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-utilities\") pod \"7d7ebf70-ea07-412e-b6ef-5574f592516f\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.253183 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-catalog-content\") pod \"7d7ebf70-ea07-412e-b6ef-5574f592516f\" (UID: \"7d7ebf70-ea07-412e-b6ef-5574f592516f\") " Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.253765 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-utilities" (OuterVolumeSpecName: "utilities") pod "7d7ebf70-ea07-412e-b6ef-5574f592516f" (UID: "7d7ebf70-ea07-412e-b6ef-5574f592516f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.262445 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7ebf70-ea07-412e-b6ef-5574f592516f-kube-api-access-c2xf7" (OuterVolumeSpecName: "kube-api-access-c2xf7") pod "7d7ebf70-ea07-412e-b6ef-5574f592516f" (UID: "7d7ebf70-ea07-412e-b6ef-5574f592516f"). InnerVolumeSpecName "kube-api-access-c2xf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.354794 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.354827 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2xf7\" (UniqueName: \"kubernetes.io/projected/7d7ebf70-ea07-412e-b6ef-5574f592516f-kube-api-access-c2xf7\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.397272 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d7ebf70-ea07-412e-b6ef-5574f592516f" (UID: "7d7ebf70-ea07-412e-b6ef-5574f592516f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.456492 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7ebf70-ea07-412e-b6ef-5574f592516f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.721353 4690 generic.go:334] "Generic (PLEG): container finished" podID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerID="e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc" exitCode=0 Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.721404 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82cvq" event={"ID":"7d7ebf70-ea07-412e-b6ef-5574f592516f","Type":"ContainerDied","Data":"e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc"} Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.721433 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82cvq" event={"ID":"7d7ebf70-ea07-412e-b6ef-5574f592516f","Type":"ContainerDied","Data":"9cb2b7e4a7ccacde459059484700670fb6821889c84fb26092957c438560a8f3"} Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.721529 4690 scope.go:117] "RemoveContainer" containerID="e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.721533 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-82cvq" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.766070 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-82cvq"] Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.774701 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-82cvq"] Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.790684 4690 scope.go:117] "RemoveContainer" containerID="20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.817314 4690 scope.go:117] "RemoveContainer" containerID="8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.857956 4690 scope.go:117] "RemoveContainer" containerID="e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc" Mar 20 18:28:57 crc kubenswrapper[4690]: E0320 18:28:57.858805 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc\": container with ID starting with e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc not found: ID does not exist" containerID="e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.858870 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc"} err="failed to get container status \"e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc\": rpc error: code = NotFound desc = could not find container \"e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc\": container with ID starting with e663f4c13cd31af12ab513eba11f604f45de45d2bbcc39539904f13ba66c09cc not found: ID does not exist" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.858897 4690 scope.go:117] "RemoveContainer" containerID="20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd" Mar 20 18:28:57 crc kubenswrapper[4690]: E0320 18:28:57.859296 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd\": container with ID starting with 20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd not found: ID does not exist" containerID="20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.859386 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd"} err="failed to get container status \"20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd\": rpc error: code = NotFound desc = could not find container \"20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd\": container with ID starting with 20e194e5af08201d9c83b83409fdab610248dae6843d5fb6f79630f8621ac9cd not found: ID does not exist" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.859468 4690 scope.go:117] "RemoveContainer" containerID="8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4" Mar 20 18:28:57 crc kubenswrapper[4690]: E0320 18:28:57.859942 4690 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4\": container with ID starting with 8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4 not found: ID does not exist" containerID="8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.859966 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4"} err="failed to get container status \"8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4\": rpc error: code = NotFound desc = could not find container \"8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4\": container with ID starting with 8f85afd8fce28eaa86cb0121ac080382d9399f78cfb6b8f00fd3c3990a21c7d4 not found: ID does not exist" Mar 20 18:28:57 crc kubenswrapper[4690]: I0320 18:28:57.892499 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" path="/var/lib/kubelet/pods/7d7ebf70-ea07-412e-b6ef-5574f592516f/volumes" Mar 20 18:29:24 crc kubenswrapper[4690]: I0320 18:29:24.274522 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:29:24 crc kubenswrapper[4690]: I0320 18:29:24.275026 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:29:54 crc kubenswrapper[4690]: I0320 18:29:54.274111 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:29:54 crc kubenswrapper[4690]: I0320 18:29:54.275126 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.165210 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567190-vwnbr"] Mar 20 18:30:00 crc kubenswrapper[4690]: E0320 18:30:00.166510 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="extract-utilities" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.166562 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="extract-utilities" Mar 20 18:30:00 crc kubenswrapper[4690]: E0320 18:30:00.166578 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="extract-content" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.166588 4690 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="extract-content" Mar 20 18:30:00 crc kubenswrapper[4690]: E0320 18:30:00.166623 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.166633 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.166947 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7ebf70-ea07-412e-b6ef-5574f592516f" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.167966 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-vwnbr" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.171642 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.171799 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.172169 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.177359 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-vwnbr"] Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.216777 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrzx\" (UniqueName: \"kubernetes.io/projected/9d26f710-83a3-42e2-861e-46f27ab271df-kube-api-access-czrzx\") pod \"auto-csr-approver-29567190-vwnbr\" (UID: \"9d26f710-83a3-42e2-861e-46f27ab271df\") " pod="openshift-infra/auto-csr-approver-29567190-vwnbr" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.255168 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg"] Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.256748 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.258922 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.259189 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.263936 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg"] Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.319125 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-secret-volume\") pod \"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.319294 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znw4v\" (UniqueName: \"kubernetes.io/projected/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-kube-api-access-znw4v\") pod \"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.319339 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrzx\" (UniqueName: \"kubernetes.io/projected/9d26f710-83a3-42e2-861e-46f27ab271df-kube-api-access-czrzx\") pod \"auto-csr-approver-29567190-vwnbr\" (UID: \"9d26f710-83a3-42e2-861e-46f27ab271df\") " pod="openshift-infra/auto-csr-approver-29567190-vwnbr" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.319364 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-config-volume\") pod \"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.353368 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrzx\" (UniqueName: \"kubernetes.io/projected/9d26f710-83a3-42e2-861e-46f27ab271df-kube-api-access-czrzx\") pod \"auto-csr-approver-29567190-vwnbr\" (UID: \"9d26f710-83a3-42e2-861e-46f27ab271df\") " pod="openshift-infra/auto-csr-approver-29567190-vwnbr" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.420901 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-config-volume\") pod \"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.421604 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-secret-volume\") pod 
\"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.421741 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znw4v\" (UniqueName: \"kubernetes.io/projected/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-kube-api-access-znw4v\") pod \"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.422100 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-config-volume\") pod \"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.432641 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-secret-volume\") pod \"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.439787 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znw4v\" (UniqueName: \"kubernetes.io/projected/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-kube-api-access-znw4v\") pod \"collect-profiles-29567190-pfgkg\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.530900 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-vwnbr" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.575796 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.960779 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-vwnbr"] Mar 20 18:30:00 crc kubenswrapper[4690]: I0320 18:30:00.966204 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:30:01 crc kubenswrapper[4690]: I0320 18:30:01.106893 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg"] Mar 20 18:30:01 crc kubenswrapper[4690]: W0320 18:30:01.109107 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9dbaaea_b9ed_42b8_b05f_7e7ac3697911.slice/crio-3c1d96bdf5bbf2f7b2e30d449431fe7f6e9d581efe4e33c9c6615e8f70d30d02 WatchSource:0}: Error finding container 3c1d96bdf5bbf2f7b2e30d449431fe7f6e9d581efe4e33c9c6615e8f70d30d02: Status 404 returned error can't find the container with id 3c1d96bdf5bbf2f7b2e30d449431fe7f6e9d581efe4e33c9c6615e8f70d30d02 Mar 20 18:30:01 crc kubenswrapper[4690]: I0320 18:30:01.336695 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" event={"ID":"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911","Type":"ContainerStarted","Data":"5ec44a4f9e63615736375e84e3eb9ed123d670f093b24bb7fdaf7f712fb38237"} Mar 20 18:30:01 crc kubenswrapper[4690]: I0320 18:30:01.337008 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" event={"ID":"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911","Type":"ContainerStarted","Data":"3c1d96bdf5bbf2f7b2e30d449431fe7f6e9d581efe4e33c9c6615e8f70d30d02"} Mar 20 18:30:01 crc kubenswrapper[4690]: I0320 18:30:01.338540 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-vwnbr" event={"ID":"9d26f710-83a3-42e2-861e-46f27ab271df","Type":"ContainerStarted","Data":"a65ef022719fb88bd1fc4408013bfd95494115d6317dc474cfee4c3a7ba9639c"} Mar 20 18:30:01 crc kubenswrapper[4690]: I0320 18:30:01.353394 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" podStartSLOduration=1.353371948 podStartE2EDuration="1.353371948s" podCreationTimestamp="2026-03-20 18:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:30:01.351639069 +0000 UTC m=+3476.217464747" watchObservedRunningTime="2026-03-20 18:30:01.353371948 +0000 UTC m=+3476.219197626" Mar 20 18:30:02 crc kubenswrapper[4690]: I0320 18:30:02.348447 4690 generic.go:334] "Generic (PLEG): container finished" podID="c9dbaaea-b9ed-42b8-b05f-7e7ac3697911" containerID="5ec44a4f9e63615736375e84e3eb9ed123d670f093b24bb7fdaf7f712fb38237" exitCode=0 Mar 20 18:30:02 crc kubenswrapper[4690]: I0320 18:30:02.348491 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" event={"ID":"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911","Type":"ContainerDied","Data":"5ec44a4f9e63615736375e84e3eb9ed123d670f093b24bb7fdaf7f712fb38237"} Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.358123 4690 generic.go:334] "Generic (PLEG): container finished" podID="9d26f710-83a3-42e2-861e-46f27ab271df" 
containerID="ba3b7a58688c9ecd6b2365531a7406696328ac397be70982f36f713b04e3952f" exitCode=0 Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.358951 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-vwnbr" event={"ID":"9d26f710-83a3-42e2-861e-46f27ab271df","Type":"ContainerDied","Data":"ba3b7a58688c9ecd6b2365531a7406696328ac397be70982f36f713b04e3952f"} Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.794324 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.979728 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-secret-volume\") pod \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.980461 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-config-volume\") pod \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.980623 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znw4v\" (UniqueName: \"kubernetes.io/projected/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-kube-api-access-znw4v\") pod \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\" (UID: \"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911\") " Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.981215 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-config-volume" (OuterVolumeSpecName: "config-volume") pod "c9dbaaea-b9ed-42b8-b05f-7e7ac3697911" (UID: "c9dbaaea-b9ed-42b8-b05f-7e7ac3697911"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.987225 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c9dbaaea-b9ed-42b8-b05f-7e7ac3697911" (UID: "c9dbaaea-b9ed-42b8-b05f-7e7ac3697911"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:30:03 crc kubenswrapper[4690]: I0320 18:30:03.987352 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-kube-api-access-znw4v" (OuterVolumeSpecName: "kube-api-access-znw4v") pod "c9dbaaea-b9ed-42b8-b05f-7e7ac3697911" (UID: "c9dbaaea-b9ed-42b8-b05f-7e7ac3697911"). InnerVolumeSpecName "kube-api-access-znw4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.083671 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znw4v\" (UniqueName: \"kubernetes.io/projected/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-kube-api-access-znw4v\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.083717 4690 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.083727 4690 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dbaaea-b9ed-42b8-b05f-7e7ac3697911-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.368037 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.369070 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-pfgkg" event={"ID":"c9dbaaea-b9ed-42b8-b05f-7e7ac3697911","Type":"ContainerDied","Data":"3c1d96bdf5bbf2f7b2e30d449431fe7f6e9d581efe4e33c9c6615e8f70d30d02"} Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.369326 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c1d96bdf5bbf2f7b2e30d449431fe7f6e9d581efe4e33c9c6615e8f70d30d02" Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.426599 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w"] Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.433440 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-rch7w"] Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.766226 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-vwnbr" Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.902694 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czrzx\" (UniqueName: \"kubernetes.io/projected/9d26f710-83a3-42e2-861e-46f27ab271df-kube-api-access-czrzx\") pod \"9d26f710-83a3-42e2-861e-46f27ab271df\" (UID: \"9d26f710-83a3-42e2-861e-46f27ab271df\") " Mar 20 18:30:04 crc kubenswrapper[4690]: I0320 18:30:04.908037 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d26f710-83a3-42e2-861e-46f27ab271df-kube-api-access-czrzx" (OuterVolumeSpecName: "kube-api-access-czrzx") pod "9d26f710-83a3-42e2-861e-46f27ab271df" (UID: "9d26f710-83a3-42e2-861e-46f27ab271df"). InnerVolumeSpecName "kube-api-access-czrzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:30:05 crc kubenswrapper[4690]: I0320 18:30:05.005317 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czrzx\" (UniqueName: \"kubernetes.io/projected/9d26f710-83a3-42e2-861e-46f27ab271df-kube-api-access-czrzx\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:05 crc kubenswrapper[4690]: I0320 18:30:05.380721 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-vwnbr" event={"ID":"9d26f710-83a3-42e2-861e-46f27ab271df","Type":"ContainerDied","Data":"a65ef022719fb88bd1fc4408013bfd95494115d6317dc474cfee4c3a7ba9639c"} Mar 20 18:30:05 crc kubenswrapper[4690]: I0320 18:30:05.381056 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a65ef022719fb88bd1fc4408013bfd95494115d6317dc474cfee4c3a7ba9639c" Mar 20 18:30:05 crc kubenswrapper[4690]: I0320 18:30:05.380791 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-vwnbr" Mar 20 18:30:05 crc kubenswrapper[4690]: I0320 18:30:05.834513 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-bt79j"] Mar 20 18:30:05 crc kubenswrapper[4690]: I0320 18:30:05.844658 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-bt79j"] Mar 20 18:30:05 crc kubenswrapper[4690]: I0320 18:30:05.897849 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eb37b19-502e-4b27-8fda-4d31630eb068" path="/var/lib/kubelet/pods/1eb37b19-502e-4b27-8fda-4d31630eb068/volumes" Mar 20 18:30:05 crc kubenswrapper[4690]: I0320 18:30:05.898974 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c3ac1c-eae1-4ff1-b67a-6892f1032d50" path="/var/lib/kubelet/pods/c9c3ac1c-eae1-4ff1-b67a-6892f1032d50/volumes" Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.274354 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.275173 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.275294 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.276472 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2e6f68efca4135e3c8fa49777a5346857b4523349d6d6127d731aa0476809cd"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.276602 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" 
podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://a2e6f68efca4135e3c8fa49777a5346857b4523349d6d6127d731aa0476809cd" gracePeriod=600 Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.588900 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="a2e6f68efca4135e3c8fa49777a5346857b4523349d6d6127d731aa0476809cd" exitCode=0 Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.588972 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"a2e6f68efca4135e3c8fa49777a5346857b4523349d6d6127d731aa0476809cd"} Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.589345 4690 scope.go:117] "RemoveContainer" containerID="24ef90ba5ffd6fe8cfd84b882fb514055d9bcdb4482ff9cdfceca9605510153c" Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.707111 4690 scope.go:117] "RemoveContainer" containerID="b274e77be08a76224f30f0f6fe348b80db5f4c05e6e9047246f5ded12e6d429f" Mar 20 18:30:24 crc kubenswrapper[4690]: I0320 18:30:24.745051 4690 scope.go:117] "RemoveContainer" containerID="2515792b8d0096fa242a7862922dea334f0bd2244a1fce9d313f283d9ac8d1c5" Mar 20 18:30:25 crc kubenswrapper[4690]: I0320 18:30:25.602227 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309"} Mar 20 18:31:54 crc kubenswrapper[4690]: I0320 18:31:54.919801 4690 generic.go:334] "Generic (PLEG): container finished" podID="86a8f040-c0ab-4923-8bab-8123fd72e63e" containerID="42bc5f8f53ea25c410557a52c9a563702a9ba2f4637ea7a15908c38f83c496c4" exitCode=0 Mar 20 18:31:54 crc kubenswrapper[4690]: I0320 18:31:54.919904 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"86a8f040-c0ab-4923-8bab-8123fd72e63e","Type":"ContainerDied","Data":"42bc5f8f53ea25c410557a52c9a563702a9ba2f4637ea7a15908c38f83c496c4"} Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.316132 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479224 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-workdir\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479284 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-config-data\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479334 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ca-certs\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479353 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6krmb\" (UniqueName: \"kubernetes.io/projected/86a8f040-c0ab-4923-8bab-8123fd72e63e-kube-api-access-6krmb\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479439 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ssh-key\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479461 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479519 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config-secret\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479573 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-temporary\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.479624 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config\") pod \"86a8f040-c0ab-4923-8bab-8123fd72e63e\" (UID: \"86a8f040-c0ab-4923-8bab-8123fd72e63e\") " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.480471 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.486024 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-config-data" (OuterVolumeSpecName: "config-data") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.488535 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.489306 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.489933 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a8f040-c0ab-4923-8bab-8123fd72e63e-kube-api-access-6krmb" (OuterVolumeSpecName: "kube-api-access-6krmb") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "kube-api-access-6krmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.512147 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.515440 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.520929 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.564243 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "86a8f040-c0ab-4923-8bab-8123fd72e63e" (UID: "86a8f040-c0ab-4923-8bab-8123fd72e63e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581576 4690 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581617 4690 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581633 4690 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86a8f040-c0ab-4923-8bab-8123fd72e63e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581645 4690 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581657 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6krmb\" (UniqueName: \"kubernetes.io/projected/86a8f040-c0ab-4923-8bab-8123fd72e63e-kube-api-access-6krmb\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581671 4690 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581710 4690 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581724 4690 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86a8f040-c0ab-4923-8bab-8123fd72e63e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.581737 4690 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86a8f040-c0ab-4923-8bab-8123fd72e63e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.606835 4690 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.683545 4690 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.944657 4690 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"86a8f040-c0ab-4923-8bab-8123fd72e63e","Type":"ContainerDied","Data":"575edbdf4eeb50533aa1847c1c8f6fe49c58ecc0f0103261b62199486a021aa6"} Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.944754 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="575edbdf4eeb50533aa1847c1c8f6fe49c58ecc0f0103261b62199486a021aa6" Mar 20 18:31:56 crc kubenswrapper[4690]: I0320 18:31:56.944821 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.169311 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567192-lsb4v"] Mar 20 18:32:00 crc kubenswrapper[4690]: E0320 18:32:00.170809 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dbaaea-b9ed-42b8-b05f-7e7ac3697911" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.170842 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dbaaea-b9ed-42b8-b05f-7e7ac3697911" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4690]: E0320 18:32:00.170917 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a8f040-c0ab-4923-8bab-8123fd72e63e" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.170935 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a8f040-c0ab-4923-8bab-8123fd72e63e" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:32:00 crc kubenswrapper[4690]: E0320 18:32:00.170964 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d26f710-83a3-42e2-861e-46f27ab271df" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.170981 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d26f710-83a3-42e2-861e-46f27ab271df" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.171434 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9dbaaea-b9ed-42b8-b05f-7e7ac3697911" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.171474 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a8f040-c0ab-4923-8bab-8123fd72e63e" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.171520 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d26f710-83a3-42e2-861e-46f27ab271df" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.172953 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-lsb4v" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.175683 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.176768 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.178047 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-lsb4v"] Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.179989 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.205397 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcr7d\" (UniqueName: \"kubernetes.io/projected/2fad462b-be0f-40f9-bc9c-8ae5aec84e6a-kube-api-access-jcr7d\") pod \"auto-csr-approver-29567192-lsb4v\" (UID: \"2fad462b-be0f-40f9-bc9c-8ae5aec84e6a\") " pod="openshift-infra/auto-csr-approver-29567192-lsb4v" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.307823 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcr7d\" (UniqueName: \"kubernetes.io/projected/2fad462b-be0f-40f9-bc9c-8ae5aec84e6a-kube-api-access-jcr7d\") pod \"auto-csr-approver-29567192-lsb4v\" (UID: \"2fad462b-be0f-40f9-bc9c-8ae5aec84e6a\") " pod="openshift-infra/auto-csr-approver-29567192-lsb4v" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.329897 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcr7d\" (UniqueName: \"kubernetes.io/projected/2fad462b-be0f-40f9-bc9c-8ae5aec84e6a-kube-api-access-jcr7d\") pod \"auto-csr-approver-29567192-lsb4v\" (UID: \"2fad462b-be0f-40f9-bc9c-8ae5aec84e6a\") " pod="openshift-infra/auto-csr-approver-29567192-lsb4v" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.500140 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-lsb4v" Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.939405 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-lsb4v"] Mar 20 18:32:00 crc kubenswrapper[4690]: W0320 18:32:00.946873 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fad462b_be0f_40f9_bc9c_8ae5aec84e6a.slice/crio-8f31a1b49f93b3e06eea8c6ac447e4242994933274d351952a942fbd505adb08 WatchSource:0}: Error finding container 8f31a1b49f93b3e06eea8c6ac447e4242994933274d351952a942fbd505adb08: Status 404 returned error can't find the container with id 8f31a1b49f93b3e06eea8c6ac447e4242994933274d351952a942fbd505adb08 Mar 20 18:32:00 crc kubenswrapper[4690]: I0320 18:32:00.990371 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-lsb4v" event={"ID":"2fad462b-be0f-40f9-bc9c-8ae5aec84e6a","Type":"ContainerStarted","Data":"8f31a1b49f93b3e06eea8c6ac447e4242994933274d351952a942fbd505adb08"} Mar 20 18:32:03 crc kubenswrapper[4690]: I0320 18:32:03.026109 4690 generic.go:334] "Generic (PLEG): container finished" podID="2fad462b-be0f-40f9-bc9c-8ae5aec84e6a" containerID="0d672a3fb53e5ac2689ddf8b7c559f859e690a26fc918689bdf48f3925e81dde" exitCode=0 Mar 20 18:32:03 crc kubenswrapper[4690]: I0320 18:32:03.026185 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-lsb4v" event={"ID":"2fad462b-be0f-40f9-bc9c-8ae5aec84e6a","Type":"ContainerDied","Data":"0d672a3fb53e5ac2689ddf8b7c559f859e690a26fc918689bdf48f3925e81dde"} Mar 20 18:32:04 crc kubenswrapper[4690]: I0320 18:32:04.415114 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-lsb4v" Mar 20 18:32:04 crc kubenswrapper[4690]: I0320 18:32:04.485056 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcr7d\" (UniqueName: \"kubernetes.io/projected/2fad462b-be0f-40f9-bc9c-8ae5aec84e6a-kube-api-access-jcr7d\") pod \"2fad462b-be0f-40f9-bc9c-8ae5aec84e6a\" (UID: \"2fad462b-be0f-40f9-bc9c-8ae5aec84e6a\") " Mar 20 18:32:04 crc kubenswrapper[4690]: I0320 18:32:04.496558 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fad462b-be0f-40f9-bc9c-8ae5aec84e6a-kube-api-access-jcr7d" (OuterVolumeSpecName: "kube-api-access-jcr7d") pod "2fad462b-be0f-40f9-bc9c-8ae5aec84e6a" (UID: "2fad462b-be0f-40f9-bc9c-8ae5aec84e6a"). InnerVolumeSpecName "kube-api-access-jcr7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:32:04 crc kubenswrapper[4690]: I0320 18:32:04.586681 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcr7d\" (UniqueName: \"kubernetes.io/projected/2fad462b-be0f-40f9-bc9c-8ae5aec84e6a-kube-api-access-jcr7d\") on node \"crc\" DevicePath \"\"" Mar 20 18:32:05 crc kubenswrapper[4690]: I0320 18:32:05.051087 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-lsb4v" event={"ID":"2fad462b-be0f-40f9-bc9c-8ae5aec84e6a","Type":"ContainerDied","Data":"8f31a1b49f93b3e06eea8c6ac447e4242994933274d351952a942fbd505adb08"} Mar 20 18:32:05 crc kubenswrapper[4690]: I0320 18:32:05.051136 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f31a1b49f93b3e06eea8c6ac447e4242994933274d351952a942fbd505adb08" Mar 20 18:32:05 crc kubenswrapper[4690]: I0320 18:32:05.051154 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-lsb4v" Mar 20 18:32:05 crc kubenswrapper[4690]: I0320 18:32:05.485350 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-ggxmj"] Mar 20 18:32:05 crc kubenswrapper[4690]: I0320 18:32:05.495496 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-ggxmj"] Mar 20 18:32:05 crc kubenswrapper[4690]: I0320 18:32:05.918838 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1632020-99f1-449a-b150-387af3337331" path="/var/lib/kubelet/pods/e1632020-99f1-449a-b150-387af3337331/volumes" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.533585 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:32:07 crc kubenswrapper[4690]: E0320 18:32:07.534494 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fad462b-be0f-40f9-bc9c-8ae5aec84e6a" containerName="oc" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.534512 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fad462b-be0f-40f9-bc9c-8ae5aec84e6a" containerName="oc" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.534834 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fad462b-be0f-40f9-bc9c-8ae5aec84e6a" containerName="oc" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.535625 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.541116 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b927d" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.552202 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.651660 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4hb\" (UniqueName: \"kubernetes.io/projected/af317ab4-ee88-4ad6-b2c8-02b26765f15f-kube-api-access-hn4hb\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"af317ab4-ee88-4ad6-b2c8-02b26765f15f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.651718 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"af317ab4-ee88-4ad6-b2c8-02b26765f15f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.753466 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4hb\" (UniqueName: \"kubernetes.io/projected/af317ab4-ee88-4ad6-b2c8-02b26765f15f-kube-api-access-hn4hb\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"af317ab4-ee88-4ad6-b2c8-02b26765f15f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.753528 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"af317ab4-ee88-4ad6-b2c8-02b26765f15f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.754069 4690 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"af317ab4-ee88-4ad6-b2c8-02b26765f15f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.785285 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4hb\" (UniqueName: \"kubernetes.io/projected/af317ab4-ee88-4ad6-b2c8-02b26765f15f-kube-api-access-hn4hb\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"af317ab4-ee88-4ad6-b2c8-02b26765f15f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:07 crc kubenswrapper[4690]: I0320 18:32:07.787132 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"af317ab4-ee88-4ad6-b2c8-02b26765f15f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:07 crc 
kubenswrapper[4690]: I0320 18:32:07.858289 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:32:08 crc kubenswrapper[4690]: I0320 18:32:08.368172 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:32:09 crc kubenswrapper[4690]: I0320 18:32:09.116729 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"af317ab4-ee88-4ad6-b2c8-02b26765f15f","Type":"ContainerStarted","Data":"1eff32a042e7a77b843c1e2555ff556da749f4a5e5c29c7ebe8d94ee074f2007"} Mar 20 18:32:10 crc kubenswrapper[4690]: I0320 18:32:10.127866 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"af317ab4-ee88-4ad6-b2c8-02b26765f15f","Type":"ContainerStarted","Data":"3b64f21431c7041878c349720324b22f3dc62a3466f0db502c569073d81369e2"} Mar 20 18:32:10 crc kubenswrapper[4690]: I0320 18:32:10.146045 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.235297992 podStartE2EDuration="3.146022369s" podCreationTimestamp="2026-03-20 18:32:07 +0000 UTC" firstStartedPulling="2026-03-20 18:32:08.374911131 +0000 UTC m=+3603.240736809" lastFinishedPulling="2026-03-20 18:32:09.285635498 +0000 UTC m=+3604.151461186" observedRunningTime="2026-03-20 18:32:10.139770482 +0000 UTC m=+3605.005596180" watchObservedRunningTime="2026-03-20 18:32:10.146022369 +0000 UTC m=+3605.011848047" Mar 20 18:32:24 crc kubenswrapper[4690]: I0320 18:32:24.273605 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:32:24 crc kubenswrapper[4690]: I0320 18:32:24.274210 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:32:24 crc kubenswrapper[4690]: I0320 18:32:24.888042 4690 scope.go:117] "RemoveContainer" containerID="17c3a62df930ee9165517410d1a4c83e575c60e6d4bf84e92750a00dea6928df" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.512457 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w49js"] Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.515020 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.544690 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w49js"] Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.550631 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-utilities\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.550696 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-catalog-content\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.550793 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7bp8\" (UniqueName: \"kubernetes.io/projected/749b0d07-eff8-4679-8e96-d11c84eb04d2-kube-api-access-h7bp8\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.651857 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7bp8\" (UniqueName: \"kubernetes.io/projected/749b0d07-eff8-4679-8e96-d11c84eb04d2-kube-api-access-h7bp8\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.652027 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-utilities\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.652071 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-catalog-content\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.652690 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-catalog-content\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.652714 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-utilities\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.672975 4690 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-h7bp8\" (UniqueName: \"kubernetes.io/projected/749b0d07-eff8-4679-8e96-d11c84eb04d2-kube-api-access-h7bp8\") pod \"redhat-marketplace-w49js\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:28 crc kubenswrapper[4690]: I0320 18:32:28.845151 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:29 crc kubenswrapper[4690]: I0320 18:32:29.325651 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w49js"] Mar 20 18:32:29 crc kubenswrapper[4690]: W0320 18:32:29.343711 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749b0d07_eff8_4679_8e96_d11c84eb04d2.slice/crio-b0c9bab92be180bcbf713f39587567273d91e29e1b24543980e61567bf632b36 WatchSource:0}: Error finding container b0c9bab92be180bcbf713f39587567273d91e29e1b24543980e61567bf632b36: Status 404 returned error can't find the container with id b0c9bab92be180bcbf713f39587567273d91e29e1b24543980e61567bf632b36 Mar 20 18:32:30 crc kubenswrapper[4690]: I0320 18:32:30.366438 4690 generic.go:334] "Generic (PLEG): container finished" podID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerID="e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47" exitCode=0 Mar 20 18:32:30 crc kubenswrapper[4690]: I0320 18:32:30.366550 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w49js" event={"ID":"749b0d07-eff8-4679-8e96-d11c84eb04d2","Type":"ContainerDied","Data":"e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47"} Mar 20 18:32:30 crc kubenswrapper[4690]: I0320 18:32:30.367079 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w49js" event={"ID":"749b0d07-eff8-4679-8e96-d11c84eb04d2","Type":"ContainerStarted","Data":"b0c9bab92be180bcbf713f39587567273d91e29e1b24543980e61567bf632b36"} Mar 20 18:32:31 crc kubenswrapper[4690]: I0320 18:32:31.377435 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w49js" event={"ID":"749b0d07-eff8-4679-8e96-d11c84eb04d2","Type":"ContainerStarted","Data":"e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6"} Mar 20 18:32:32 crc kubenswrapper[4690]: I0320 18:32:32.387808 4690 generic.go:334] "Generic (PLEG): container finished" podID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerID="e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6" exitCode=0 Mar 20 18:32:32 crc kubenswrapper[4690]: I0320 18:32:32.387873 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w49js" event={"ID":"749b0d07-eff8-4679-8e96-d11c84eb04d2","Type":"ContainerDied","Data":"e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6"} Mar 20 18:32:33 crc kubenswrapper[4690]: I0320 18:32:33.400992 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w49js" event={"ID":"749b0d07-eff8-4679-8e96-d11c84eb04d2","Type":"ContainerStarted","Data":"4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc"} Mar 20 18:32:33 crc kubenswrapper[4690]: I0320 18:32:33.431209 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w49js" podStartSLOduration=3.045341281 
podStartE2EDuration="5.43118794s" podCreationTimestamp="2026-03-20 18:32:28 +0000 UTC" firstStartedPulling="2026-03-20 18:32:30.370189139 +0000 UTC m=+3625.236014817" lastFinishedPulling="2026-03-20 18:32:32.756035808 +0000 UTC m=+3627.621861476" observedRunningTime="2026-03-20 18:32:33.423142612 +0000 UTC m=+3628.288968310" watchObservedRunningTime="2026-03-20 18:32:33.43118794 +0000 UTC m=+3628.297013638" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.760245 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwxb9/must-gather-t7qds"] Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.762349 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.766890 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pwxb9"/"openshift-service-ca.crt" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.766969 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pwxb9"/"default-dockercfg-4qlzb" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.767249 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pwxb9"/"kube-root-ca.crt" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.779578 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pwxb9/must-gather-t7qds"] Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.893994 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f57ca90-28d1-4064-a387-af2a1bf69731-must-gather-output\") pod \"must-gather-t7qds\" (UID: \"8f57ca90-28d1-4064-a387-af2a1bf69731\") " pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.894352 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph65n\" (UniqueName: \"kubernetes.io/projected/8f57ca90-28d1-4064-a387-af2a1bf69731-kube-api-access-ph65n\") pod \"must-gather-t7qds\" (UID: \"8f57ca90-28d1-4064-a387-af2a1bf69731\") " pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.995585 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f57ca90-28d1-4064-a387-af2a1bf69731-must-gather-output\") pod \"must-gather-t7qds\" (UID: \"8f57ca90-28d1-4064-a387-af2a1bf69731\") " pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.995913 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph65n\" (UniqueName: \"kubernetes.io/projected/8f57ca90-28d1-4064-a387-af2a1bf69731-kube-api-access-ph65n\") pod \"must-gather-t7qds\" (UID: \"8f57ca90-28d1-4064-a387-af2a1bf69731\") " pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:32:34 crc kubenswrapper[4690]: I0320 18:32:34.997766 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f57ca90-28d1-4064-a387-af2a1bf69731-must-gather-output\") pod \"must-gather-t7qds\" (UID: \"8f57ca90-28d1-4064-a387-af2a1bf69731\") " pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:32:35 crc kubenswrapper[4690]: I0320 
18:32:35.022397 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph65n\" (UniqueName: \"kubernetes.io/projected/8f57ca90-28d1-4064-a387-af2a1bf69731-kube-api-access-ph65n\") pod \"must-gather-t7qds\" (UID: \"8f57ca90-28d1-4064-a387-af2a1bf69731\") " pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:32:35 crc kubenswrapper[4690]: I0320 18:32:35.084660 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:32:35 crc kubenswrapper[4690]: I0320 18:32:35.490039 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pwxb9/must-gather-t7qds"] Mar 20 18:32:36 crc kubenswrapper[4690]: I0320 18:32:36.455210 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/must-gather-t7qds" event={"ID":"8f57ca90-28d1-4064-a387-af2a1bf69731","Type":"ContainerStarted","Data":"046a5f8f302d3a8753c53f037b5aff4082b87e6b8a5247e3a3f7fdb4e3f891aa"} Mar 20 18:32:38 crc kubenswrapper[4690]: I0320 18:32:38.846267 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:38 crc kubenswrapper[4690]: I0320 18:32:38.846317 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:38 crc kubenswrapper[4690]: I0320 18:32:38.891376 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:39 crc kubenswrapper[4690]: I0320 18:32:39.530680 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:39 crc kubenswrapper[4690]: I0320 18:32:39.587972 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w49js"] Mar 20 18:32:40 crc kubenswrapper[4690]: I0320 18:32:40.496135 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/must-gather-t7qds" event={"ID":"8f57ca90-28d1-4064-a387-af2a1bf69731","Type":"ContainerStarted","Data":"de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626"} Mar 20 18:32:40 crc kubenswrapper[4690]: I0320 18:32:40.496409 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/must-gather-t7qds" event={"ID":"8f57ca90-28d1-4064-a387-af2a1bf69731","Type":"ContainerStarted","Data":"06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec"} Mar 20 18:32:40 crc kubenswrapper[4690]: I0320 18:32:40.523224 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pwxb9/must-gather-t7qds" podStartSLOduration=2.705965926 podStartE2EDuration="6.523205945s" podCreationTimestamp="2026-03-20 18:32:34 +0000 UTC" firstStartedPulling="2026-03-20 18:32:35.497753659 +0000 UTC m=+3630.363579367" lastFinishedPulling="2026-03-20 18:32:39.314993698 +0000 UTC m=+3634.180819386" observedRunningTime="2026-03-20 18:32:40.512966165 +0000 UTC m=+3635.378791933" watchObservedRunningTime="2026-03-20 18:32:40.523205945 +0000 UTC m=+3635.389031623" Mar 20 18:32:41 crc kubenswrapper[4690]: I0320 18:32:41.506738 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w49js" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerName="registry-server" 
containerID="cri-o://4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc" gracePeriod=2 Mar 20 18:32:41 crc kubenswrapper[4690]: I0320 18:32:41.990042 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.136977 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-catalog-content\") pod \"749b0d07-eff8-4679-8e96-d11c84eb04d2\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.137333 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-utilities\") pod \"749b0d07-eff8-4679-8e96-d11c84eb04d2\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.137519 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7bp8\" (UniqueName: \"kubernetes.io/projected/749b0d07-eff8-4679-8e96-d11c84eb04d2-kube-api-access-h7bp8\") pod \"749b0d07-eff8-4679-8e96-d11c84eb04d2\" (UID: \"749b0d07-eff8-4679-8e96-d11c84eb04d2\") " Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.138273 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-utilities" (OuterVolumeSpecName: "utilities") pod "749b0d07-eff8-4679-8e96-d11c84eb04d2" (UID: "749b0d07-eff8-4679-8e96-d11c84eb04d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.143711 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749b0d07-eff8-4679-8e96-d11c84eb04d2-kube-api-access-h7bp8" (OuterVolumeSpecName: "kube-api-access-h7bp8") pod "749b0d07-eff8-4679-8e96-d11c84eb04d2" (UID: "749b0d07-eff8-4679-8e96-d11c84eb04d2"). InnerVolumeSpecName "kube-api-access-h7bp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.163719 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "749b0d07-eff8-4679-8e96-d11c84eb04d2" (UID: "749b0d07-eff8-4679-8e96-d11c84eb04d2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.239869 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.239898 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7bp8\" (UniqueName: \"kubernetes.io/projected/749b0d07-eff8-4679-8e96-d11c84eb04d2-kube-api-access-h7bp8\") on node \"crc\" DevicePath \"\"" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.239908 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/749b0d07-eff8-4679-8e96-d11c84eb04d2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.562812 4690 generic.go:334] "Generic (PLEG): container finished" podID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerID="4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc" exitCode=0 Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.563144 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w49js" event={"ID":"749b0d07-eff8-4679-8e96-d11c84eb04d2","Type":"ContainerDied","Data":"4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc"} Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.563174 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w49js" event={"ID":"749b0d07-eff8-4679-8e96-d11c84eb04d2","Type":"ContainerDied","Data":"b0c9bab92be180bcbf713f39587567273d91e29e1b24543980e61567bf632b36"} Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.563191 4690 scope.go:117] "RemoveContainer" containerID="4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.563379 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w49js" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.610432 4690 scope.go:117] "RemoveContainer" containerID="e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.651119 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w49js"] Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.659035 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w49js"] Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.696362 4690 scope.go:117] "RemoveContainer" containerID="e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.751644 4690 scope.go:117] "RemoveContainer" containerID="4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc" Mar 20 18:32:42 crc kubenswrapper[4690]: E0320 18:32:42.752951 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc\": container with ID starting with 4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc not found: ID does not exist" containerID="4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.752987 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc"} err="failed to get container status \"4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc\": rpc error: code = NotFound desc = could not find container \"4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc\": container with ID starting with 4d9a8d60f2b4a1c9f844c74ffe3d1daa839bb8c3e7a0935abf8ac19d5f8108dc not found: ID does not exist" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.753010 4690 scope.go:117] "RemoveContainer" containerID="e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6" Mar 20 18:32:42 crc kubenswrapper[4690]: E0320 18:32:42.753581 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6\": container with ID starting with e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6 not found: ID does not exist" containerID="e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.753628 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6"} err="failed to get container status \"e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6\": rpc error: code = NotFound desc = could not find container \"e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6\": container with ID starting with e1941b984618b2eb57eaaa71e1a9f9b93873b2b71f0a3ae3d7eada41d481bdf6 not found: ID does not exist" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.753656 4690 scope.go:117] "RemoveContainer" containerID="e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47" Mar 20 18:32:42 crc kubenswrapper[4690]: E0320 18:32:42.754043 4690 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47\": container with ID starting with e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47 not found: ID does not exist" containerID="e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47" Mar 20 18:32:42 crc kubenswrapper[4690]: I0320 18:32:42.754098 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47"} err="failed to get container status \"e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47\": rpc error: code = NotFound desc = could not find container \"e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47\": container with ID starting with e05b9b53b2dafc267c25a7e403f7e8e96acf73cc955d2b8cf0f8a184e4ac9a47 not found: ID does not exist" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.355039 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-hfqq2"] Mar 20 18:32:43 crc kubenswrapper[4690]: E0320 18:32:43.355453 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerName="extract-utilities" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.355466 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerName="extract-utilities" Mar 20 18:32:43 crc kubenswrapper[4690]: E0320 18:32:43.355482 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerName="registry-server" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.355490 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerName="registry-server" Mar 20 18:32:43 crc kubenswrapper[4690]: E0320 18:32:43.355508 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerName="extract-content" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.355514 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerName="extract-content" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.355695 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" containerName="registry-server" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.356342 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.461554 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59r4k\" (UniqueName: \"kubernetes.io/projected/e4c83e35-634d-498f-99df-23c1dad27989-kube-api-access-59r4k\") pod \"crc-debug-hfqq2\" (UID: \"e4c83e35-634d-498f-99df-23c1dad27989\") " pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.461810 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c83e35-634d-498f-99df-23c1dad27989-host\") pod \"crc-debug-hfqq2\" (UID: \"e4c83e35-634d-498f-99df-23c1dad27989\") " pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.563405 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59r4k\" (UniqueName: \"kubernetes.io/projected/e4c83e35-634d-498f-99df-23c1dad27989-kube-api-access-59r4k\") pod \"crc-debug-hfqq2\" (UID: \"e4c83e35-634d-498f-99df-23c1dad27989\") " pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.563545 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c83e35-634d-498f-99df-23c1dad27989-host\") pod \"crc-debug-hfqq2\" (UID: \"e4c83e35-634d-498f-99df-23c1dad27989\") " pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.563659 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c83e35-634d-498f-99df-23c1dad27989-host\") pod \"crc-debug-hfqq2\" (UID: \"e4c83e35-634d-498f-99df-23c1dad27989\") " pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.583447 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59r4k\" (UniqueName: \"kubernetes.io/projected/e4c83e35-634d-498f-99df-23c1dad27989-kube-api-access-59r4k\") pod \"crc-debug-hfqq2\" (UID: \"e4c83e35-634d-498f-99df-23c1dad27989\") " pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.678771 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:32:43 crc kubenswrapper[4690]: I0320 18:32:43.894165 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749b0d07-eff8-4679-8e96-d11c84eb04d2" path="/var/lib/kubelet/pods/749b0d07-eff8-4679-8e96-d11c84eb04d2/volumes" Mar 20 18:32:44 crc kubenswrapper[4690]: I0320 18:32:44.588649 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" event={"ID":"e4c83e35-634d-498f-99df-23c1dad27989","Type":"ContainerStarted","Data":"228057ff948835bb5e4d06cf8423e27e9220688ab8f04374cb45b7242acd88ef"} Mar 20 18:32:54 crc kubenswrapper[4690]: I0320 18:32:54.274382 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:32:54 crc kubenswrapper[4690]: I0320 18:32:54.274956 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:32:55 crc kubenswrapper[4690]: I0320 18:32:55.682646 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" event={"ID":"e4c83e35-634d-498f-99df-23c1dad27989","Type":"ContainerStarted","Data":"5091fd8657afe0c47155cc97e7fce21b48d72f658e95ae63fd2ae9c29f36d962"} Mar 20 18:32:55 crc kubenswrapper[4690]: I0320 18:32:55.697049 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" podStartSLOduration=1.185118611 podStartE2EDuration="12.697032934s" podCreationTimestamp="2026-03-20 18:32:43 +0000 UTC" firstStartedPulling="2026-03-20 18:32:43.728651377 +0000 UTC m=+3638.594477055" lastFinishedPulling="2026-03-20 18:32:55.2405657 +0000 UTC m=+3650.106391378" observedRunningTime="2026-03-20 18:32:55.696433727 +0000 UTC m=+3650.562259405" watchObservedRunningTime="2026-03-20 18:32:55.697032934 +0000 UTC m=+3650.562858612" Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.274465 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.275016 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.275062 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.275776 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.275829 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" gracePeriod=600 Mar 20 18:33:24 crc kubenswrapper[4690]: E0320 18:33:24.401863 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.993316 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" exitCode=0 Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.993384 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309"} Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.993438 4690 scope.go:117] "RemoveContainer" containerID="a2e6f68efca4135e3c8fa49777a5346857b4523349d6d6127d731aa0476809cd" Mar 20 18:33:24 crc kubenswrapper[4690]: I0320 18:33:24.995134 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:33:24 crc kubenswrapper[4690]: E0320 18:33:24.995788 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:33:24 crc kubenswrapper[4690]: E0320 18:33:24.999425 4690 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/machine-config-daemon-wtg2q_openshift-machine-config-operator_machine-config-daemon-a2e6f68efca4135e3c8fa49777a5346857b4523349d6d6127d731aa0476809cd.log: no such file or directory" path="/var/log/containers/machine-config-daemon-wtg2q_openshift-machine-config-operator_machine-config-daemon-a2e6f68efca4135e3c8fa49777a5346857b4523349d6d6127d731aa0476809cd.log" Mar 20 18:33:33 crc kubenswrapper[4690]: I0320 18:33:33.310030 4690 generic.go:334] "Generic (PLEG): container finished" podID="e4c83e35-634d-498f-99df-23c1dad27989" containerID="5091fd8657afe0c47155cc97e7fce21b48d72f658e95ae63fd2ae9c29f36d962" exitCode=0 Mar 20 18:33:33 crc kubenswrapper[4690]: I0320 18:33:33.310596 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" 
event={"ID":"e4c83e35-634d-498f-99df-23c1dad27989","Type":"ContainerDied","Data":"5091fd8657afe0c47155cc97e7fce21b48d72f658e95ae63fd2ae9c29f36d962"} Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.424864 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.475434 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-hfqq2"] Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.485590 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-hfqq2"] Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.534721 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c83e35-634d-498f-99df-23c1dad27989-host\") pod \"e4c83e35-634d-498f-99df-23c1dad27989\" (UID: \"e4c83e35-634d-498f-99df-23c1dad27989\") " Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.534802 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59r4k\" (UniqueName: \"kubernetes.io/projected/e4c83e35-634d-498f-99df-23c1dad27989-kube-api-access-59r4k\") pod \"e4c83e35-634d-498f-99df-23c1dad27989\" (UID: \"e4c83e35-634d-498f-99df-23c1dad27989\") " Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.534831 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c83e35-634d-498f-99df-23c1dad27989-host" (OuterVolumeSpecName: "host") pod "e4c83e35-634d-498f-99df-23c1dad27989" (UID: "e4c83e35-634d-498f-99df-23c1dad27989"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.535171 4690 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c83e35-634d-498f-99df-23c1dad27989-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.540375 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c83e35-634d-498f-99df-23c1dad27989-kube-api-access-59r4k" (OuterVolumeSpecName: "kube-api-access-59r4k") pod "e4c83e35-634d-498f-99df-23c1dad27989" (UID: "e4c83e35-634d-498f-99df-23c1dad27989"). InnerVolumeSpecName "kube-api-access-59r4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:33:34 crc kubenswrapper[4690]: I0320 18:33:34.637557 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59r4k\" (UniqueName: \"kubernetes.io/projected/e4c83e35-634d-498f-99df-23c1dad27989-kube-api-access-59r4k\") on node \"crc\" DevicePath \"\"" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.332342 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-hfqq2" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.332384 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228057ff948835bb5e4d06cf8423e27e9220688ab8f04374cb45b7242acd88ef" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.647832 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-wpsr8"] Mar 20 18:33:35 crc kubenswrapper[4690]: E0320 18:33:35.648353 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c83e35-634d-498f-99df-23c1dad27989" containerName="container-00" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.648372 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c83e35-634d-498f-99df-23c1dad27989" containerName="container-00" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.648594 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c83e35-634d-498f-99df-23c1dad27989" containerName="container-00" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.649354 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.760307 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzjwd\" (UniqueName: \"kubernetes.io/projected/437f4262-e98a-468a-93bc-747de5a45c92-kube-api-access-gzjwd\") pod \"crc-debug-wpsr8\" (UID: \"437f4262-e98a-468a-93bc-747de5a45c92\") " pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.760391 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/437f4262-e98a-468a-93bc-747de5a45c92-host\") pod \"crc-debug-wpsr8\" (UID: \"437f4262-e98a-468a-93bc-747de5a45c92\") " pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.862075 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzjwd\" (UniqueName: \"kubernetes.io/projected/437f4262-e98a-468a-93bc-747de5a45c92-kube-api-access-gzjwd\") pod \"crc-debug-wpsr8\" (UID: \"437f4262-e98a-468a-93bc-747de5a45c92\") " pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.862149 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/437f4262-e98a-468a-93bc-747de5a45c92-host\") pod \"crc-debug-wpsr8\" (UID: \"437f4262-e98a-468a-93bc-747de5a45c92\") " pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.862278 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/437f4262-e98a-468a-93bc-747de5a45c92-host\") pod \"crc-debug-wpsr8\" (UID: \"437f4262-e98a-468a-93bc-747de5a45c92\") " pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.879728 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzjwd\" (UniqueName: \"kubernetes.io/projected/437f4262-e98a-468a-93bc-747de5a45c92-kube-api-access-gzjwd\") pod \"crc-debug-wpsr8\" (UID: \"437f4262-e98a-468a-93bc-747de5a45c92\") " pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:35 crc 
kubenswrapper[4690]: I0320 18:33:35.894723 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c83e35-634d-498f-99df-23c1dad27989" path="/var/lib/kubelet/pods/e4c83e35-634d-498f-99df-23c1dad27989/volumes" Mar 20 18:33:35 crc kubenswrapper[4690]: I0320 18:33:35.967888 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:36 crc kubenswrapper[4690]: I0320 18:33:36.344322 4690 generic.go:334] "Generic (PLEG): container finished" podID="437f4262-e98a-468a-93bc-747de5a45c92" containerID="ffd40a1364754b0e9d9152f8f94776b54ed7be052ea8c309754bdd551ffbdec5" exitCode=0 Mar 20 18:33:36 crc kubenswrapper[4690]: I0320 18:33:36.344600 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" event={"ID":"437f4262-e98a-468a-93bc-747de5a45c92","Type":"ContainerDied","Data":"ffd40a1364754b0e9d9152f8f94776b54ed7be052ea8c309754bdd551ffbdec5"} Mar 20 18:33:36 crc kubenswrapper[4690]: I0320 18:33:36.344635 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" event={"ID":"437f4262-e98a-468a-93bc-747de5a45c92","Type":"ContainerStarted","Data":"956c2a8160d594e30789e047214b43929776b214091d46907fef24a7b7c478fc"} Mar 20 18:33:36 crc kubenswrapper[4690]: I0320 18:33:36.761049 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-wpsr8"] Mar 20 18:33:36 crc kubenswrapper[4690]: I0320 18:33:36.768118 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-wpsr8"] Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.462039 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.489382 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzjwd\" (UniqueName: \"kubernetes.io/projected/437f4262-e98a-468a-93bc-747de5a45c92-kube-api-access-gzjwd\") pod \"437f4262-e98a-468a-93bc-747de5a45c92\" (UID: \"437f4262-e98a-468a-93bc-747de5a45c92\") " Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.489505 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/437f4262-e98a-468a-93bc-747de5a45c92-host\") pod \"437f4262-e98a-468a-93bc-747de5a45c92\" (UID: \"437f4262-e98a-468a-93bc-747de5a45c92\") " Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.489804 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/437f4262-e98a-468a-93bc-747de5a45c92-host" (OuterVolumeSpecName: "host") pod "437f4262-e98a-468a-93bc-747de5a45c92" (UID: "437f4262-e98a-468a-93bc-747de5a45c92"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.490052 4690 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/437f4262-e98a-468a-93bc-747de5a45c92-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.520813 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437f4262-e98a-468a-93bc-747de5a45c92-kube-api-access-gzjwd" (OuterVolumeSpecName: "kube-api-access-gzjwd") pod "437f4262-e98a-468a-93bc-747de5a45c92" (UID: "437f4262-e98a-468a-93bc-747de5a45c92"). InnerVolumeSpecName "kube-api-access-gzjwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.592468 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzjwd\" (UniqueName: \"kubernetes.io/projected/437f4262-e98a-468a-93bc-747de5a45c92-kube-api-access-gzjwd\") on node \"crc\" DevicePath \"\"" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.882642 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:33:37 crc kubenswrapper[4690]: E0320 18:33:37.882940 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.894143 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437f4262-e98a-468a-93bc-747de5a45c92" path="/var/lib/kubelet/pods/437f4262-e98a-468a-93bc-747de5a45c92/volumes" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.974930 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-96fgr"] Mar 20 18:33:37 crc kubenswrapper[4690]: E0320 18:33:37.975550 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437f4262-e98a-468a-93bc-747de5a45c92" containerName="container-00" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.975571 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="437f4262-e98a-468a-93bc-747de5a45c92" containerName="container-00" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.976004 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="437f4262-e98a-468a-93bc-747de5a45c92" containerName="container-00" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.976818 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.999060 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20831693-9d8b-4133-8a1c-482fbc8e6cb6-host\") pod \"crc-debug-96fgr\" (UID: \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\") " pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:37 crc kubenswrapper[4690]: I0320 18:33:37.999117 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smcfm\" (UniqueName: \"kubernetes.io/projected/20831693-9d8b-4133-8a1c-482fbc8e6cb6-kube-api-access-smcfm\") pod \"crc-debug-96fgr\" (UID: \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\") " pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:38 crc kubenswrapper[4690]: I0320 18:33:38.101229 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20831693-9d8b-4133-8a1c-482fbc8e6cb6-host\") pod \"crc-debug-96fgr\" (UID: \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\") " pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:38 crc kubenswrapper[4690]: I0320 18:33:38.101368 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20831693-9d8b-4133-8a1c-482fbc8e6cb6-host\") pod \"crc-debug-96fgr\" (UID: \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\") " pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:38 crc kubenswrapper[4690]: I0320 18:33:38.101506 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smcfm\" (UniqueName: \"kubernetes.io/projected/20831693-9d8b-4133-8a1c-482fbc8e6cb6-kube-api-access-smcfm\") pod \"crc-debug-96fgr\" (UID: \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\") " pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:38 crc kubenswrapper[4690]: I0320 18:33:38.122836 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smcfm\" (UniqueName: \"kubernetes.io/projected/20831693-9d8b-4133-8a1c-482fbc8e6cb6-kube-api-access-smcfm\") pod \"crc-debug-96fgr\" (UID: \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\") " pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:38 crc kubenswrapper[4690]: I0320 18:33:38.292844 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:38 crc kubenswrapper[4690]: W0320 18:33:38.321401 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20831693_9d8b_4133_8a1c_482fbc8e6cb6.slice/crio-b76dff59aea569fe21da71dd4afc8b82a8bc7c1668242bc912862aebefdfb1d0 WatchSource:0}: Error finding container b76dff59aea569fe21da71dd4afc8b82a8bc7c1668242bc912862aebefdfb1d0: Status 404 returned error can't find the container with id b76dff59aea569fe21da71dd4afc8b82a8bc7c1668242bc912862aebefdfb1d0 Mar 20 18:33:38 crc kubenswrapper[4690]: I0320 18:33:38.360937 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/crc-debug-96fgr" event={"ID":"20831693-9d8b-4133-8a1c-482fbc8e6cb6","Type":"ContainerStarted","Data":"b76dff59aea569fe21da71dd4afc8b82a8bc7c1668242bc912862aebefdfb1d0"} Mar 20 18:33:38 crc kubenswrapper[4690]: I0320 18:33:38.362925 4690 scope.go:117] "RemoveContainer" containerID="ffd40a1364754b0e9d9152f8f94776b54ed7be052ea8c309754bdd551ffbdec5" Mar 20 18:33:38 crc kubenswrapper[4690]: I0320 18:33:38.363357 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-wpsr8" Mar 20 18:33:39 crc kubenswrapper[4690]: I0320 18:33:39.375515 4690 generic.go:334] "Generic (PLEG): container finished" podID="20831693-9d8b-4133-8a1c-482fbc8e6cb6" containerID="3f10c1b92fcff70e5ae89663c8c21ae7bb4c569114255aaf27ac90930fdd56cb" exitCode=0 Mar 20 18:33:39 crc kubenswrapper[4690]: I0320 18:33:39.375594 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/crc-debug-96fgr" event={"ID":"20831693-9d8b-4133-8a1c-482fbc8e6cb6","Type":"ContainerDied","Data":"3f10c1b92fcff70e5ae89663c8c21ae7bb4c569114255aaf27ac90930fdd56cb"} Mar 20 18:33:39 crc kubenswrapper[4690]: I0320 18:33:39.412503 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-96fgr"] Mar 20 18:33:39 crc kubenswrapper[4690]: I0320 18:33:39.420646 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pwxb9/crc-debug-96fgr"] Mar 20 18:33:40 crc kubenswrapper[4690]: I0320 18:33:40.491562 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:40 crc kubenswrapper[4690]: I0320 18:33:40.543385 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smcfm\" (UniqueName: \"kubernetes.io/projected/20831693-9d8b-4133-8a1c-482fbc8e6cb6-kube-api-access-smcfm\") pod \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\" (UID: \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\") " Mar 20 18:33:40 crc kubenswrapper[4690]: I0320 18:33:40.543465 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20831693-9d8b-4133-8a1c-482fbc8e6cb6-host\") pod \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\" (UID: \"20831693-9d8b-4133-8a1c-482fbc8e6cb6\") " Mar 20 18:33:40 crc kubenswrapper[4690]: I0320 18:33:40.544078 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20831693-9d8b-4133-8a1c-482fbc8e6cb6-host" (OuterVolumeSpecName: "host") pod "20831693-9d8b-4133-8a1c-482fbc8e6cb6" (UID: "20831693-9d8b-4133-8a1c-482fbc8e6cb6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:33:40 crc kubenswrapper[4690]: I0320 18:33:40.552594 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20831693-9d8b-4133-8a1c-482fbc8e6cb6-kube-api-access-smcfm" (OuterVolumeSpecName: "kube-api-access-smcfm") pod "20831693-9d8b-4133-8a1c-482fbc8e6cb6" (UID: "20831693-9d8b-4133-8a1c-482fbc8e6cb6"). InnerVolumeSpecName "kube-api-access-smcfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:33:40 crc kubenswrapper[4690]: I0320 18:33:40.645949 4690 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20831693-9d8b-4133-8a1c-482fbc8e6cb6-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:33:40 crc kubenswrapper[4690]: I0320 18:33:40.646001 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smcfm\" (UniqueName: \"kubernetes.io/projected/20831693-9d8b-4133-8a1c-482fbc8e6cb6-kube-api-access-smcfm\") on node \"crc\" DevicePath \"\"" Mar 20 18:33:41 crc kubenswrapper[4690]: I0320 18:33:41.392712 4690 scope.go:117] "RemoveContainer" containerID="3f10c1b92fcff70e5ae89663c8c21ae7bb4c569114255aaf27ac90930fdd56cb" Mar 20 18:33:41 crc kubenswrapper[4690]: I0320 18:33:41.392745 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwxb9/crc-debug-96fgr" Mar 20 18:33:41 crc kubenswrapper[4690]: I0320 18:33:41.895957 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20831693-9d8b-4133-8a1c-482fbc8e6cb6" path="/var/lib/kubelet/pods/20831693-9d8b-4133-8a1c-482fbc8e6cb6/volumes" Mar 20 18:33:49 crc kubenswrapper[4690]: I0320 18:33:49.883677 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:33:49 crc kubenswrapper[4690]: E0320 18:33:49.884543 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.062555 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-944584c5d-v2mwf_7d7354f4-3635-4c6c-a382-f405c559ef59/barbican-api/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.247032 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-944584c5d-v2mwf_7d7354f4-3635-4c6c-a382-f405c559ef59/barbican-api-log/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.305881 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f7645db4-ph4nl_eb2225a2-e763-42d5-affd-562463c266e6/barbican-keystone-listener/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.358189 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f7645db4-ph4nl_eb2225a2-e763-42d5-affd-562463c266e6/barbican-keystone-listener-log/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.495634 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d4bbb45f-p692s_d6df0c1b-ea55-44ae-8fb9-9573c54322a8/barbican-worker/0.log" Mar 20 18:33:55 crc 
kubenswrapper[4690]: I0320 18:33:55.529374 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d4bbb45f-p692s_d6df0c1b-ea55-44ae-8fb9-9573c54322a8/barbican-worker-log/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.807712 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5/ceilometer-central-agent/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.826154 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf_33405126-fa78-4ad4-9587-e157ffd9f389/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.903113 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5/ceilometer-notification-agent/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.951965 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5/proxy-httpd/0.log" Mar 20 18:33:55 crc kubenswrapper[4690]: I0320 18:33:55.987944 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5/sg-core/0.log" Mar 20 18:33:56 crc kubenswrapper[4690]: I0320 18:33:56.244151 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_370661b8-157c-4a7f-ae3e-379d122d48b3/cinder-api/0.log" Mar 20 18:33:56 crc kubenswrapper[4690]: I0320 18:33:56.343207 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_370661b8-157c-4a7f-ae3e-379d122d48b3/cinder-api-log/0.log" Mar 20 18:33:56 crc kubenswrapper[4690]: I0320 18:33:56.462837 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_582dd6c0-32f0-41f1-b62d-2dfc7f5b6509/cinder-scheduler/0.log" Mar 20 18:33:56 crc kubenswrapper[4690]: I0320 18:33:56.551793 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_582dd6c0-32f0-41f1-b62d-2dfc7f5b6509/probe/0.log" Mar 20 18:33:56 crc kubenswrapper[4690]: I0320 18:33:56.602171 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lbwch_6983c278-26ba-4802-9320-1270d48b04ce/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:33:56 crc kubenswrapper[4690]: I0320 18:33:56.844837 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p85rs_a59c2f4e-a048-421a-b4db-5411eeb2c3fd/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:33:56 crc kubenswrapper[4690]: I0320 18:33:56.845217 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q7j6l_50265e08-57d1-4ae0-8434-086c38b3e525/init/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.053576 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q7j6l_50265e08-57d1-4ae0-8434-086c38b3e525/dnsmasq-dns/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.055921 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q7j6l_50265e08-57d1-4ae0-8434-086c38b3e525/init/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.167811 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk_86d7b6e3-05d5-475d-b95f-9ba0d5b43df4/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.299889 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9b415aa0-2e76-4f43-8f53-2da695c5b62e/glance-log/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.304396 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9b415aa0-2e76-4f43-8f53-2da695c5b62e/glance-httpd/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.456479 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d70983fe-8325-430a-beeb-fa3b8007e70e/glance-httpd/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.516343 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d70983fe-8325-430a-beeb-fa3b8007e70e/glance-log/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.688733 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dc95ccffb-gvrdq_799b195a-e6e5-4a19-b41a-1c7550e21e90/horizon/0.log" Mar 20 18:33:57 crc kubenswrapper[4690]: I0320 18:33:57.847518 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-w5shc_e1dd8af3-0ac3-42c4-ba88-c891b8c971bd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:33:58 crc kubenswrapper[4690]: I0320 18:33:58.101002 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dc95ccffb-gvrdq_799b195a-e6e5-4a19-b41a-1c7550e21e90/horizon-log/0.log" Mar 20 18:33:58 crc kubenswrapper[4690]: I0320 18:33:58.235064 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xpjkk_7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:33:58 crc kubenswrapper[4690]: I0320 18:33:58.365908 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567161-sq958_3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66/keystone-cron/0.log" Mar 20 18:33:58 crc kubenswrapper[4690]: I0320 18:33:58.378064 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b966595d7-ccrp2_112d1eb4-f375-4825-94e3-d721fbafbeaa/keystone-api/0.log" Mar 20 18:33:58 crc kubenswrapper[4690]: I0320 18:33:58.527097 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4d1751ac-6582-4c73-aef9-12952bde5126/kube-state-metrics/0.log" Mar 20 18:33:59 crc kubenswrapper[4690]: I0320 18:33:59.105039 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-926mx_ca6878cf-74a4-4bf6-8e36-bf1a669d787f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:33:59 crc kubenswrapper[4690]: I0320 18:33:59.189626 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-675c5fd7b7-z9vsh_1ce9f480-c11d-4009-98e7-8e1d4a13ecd8/neutron-api/0.log" Mar 20 18:33:59 crc kubenswrapper[4690]: I0320 18:33:59.237832 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-675c5fd7b7-z9vsh_1ce9f480-c11d-4009-98e7-8e1d4a13ecd8/neutron-httpd/0.log" Mar 20 18:33:59 crc kubenswrapper[4690]: I0320 18:33:59.541097 4690 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s_c59bc866-150a-4671-8bbf-91aea8f32646/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.014825 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9d9df793-f6e2-4d60-a54d-971847c8d3ea/nova-cell0-conductor-conductor/0.log" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.056362 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5f8feaad-3661-4ea6-9e2d-90cf79d48df9/nova-api-log/0.log" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.154198 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567194-pqwm6"] Mar 20 18:34:00 crc kubenswrapper[4690]: E0320 18:34:00.154711 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20831693-9d8b-4133-8a1c-482fbc8e6cb6" containerName="container-00" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.154731 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="20831693-9d8b-4133-8a1c-482fbc8e6cb6" containerName="container-00" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.154924 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="20831693-9d8b-4133-8a1c-482fbc8e6cb6" containerName="container-00" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.155559 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.158630 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.158631 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.158690 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.190886 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-pqwm6"] Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.235342 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfffw\" (UniqueName: \"kubernetes.io/projected/3e3f0313-6464-43d3-96df-ae1a675322b1-kube-api-access-vfffw\") pod \"auto-csr-approver-29567194-pqwm6\" (UID: \"3e3f0313-6464-43d3-96df-ae1a675322b1\") " pod="openshift-infra/auto-csr-approver-29567194-pqwm6" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.336787 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfffw\" (UniqueName: \"kubernetes.io/projected/3e3f0313-6464-43d3-96df-ae1a675322b1-kube-api-access-vfffw\") pod \"auto-csr-approver-29567194-pqwm6\" (UID: \"3e3f0313-6464-43d3-96df-ae1a675322b1\") " pod="openshift-infra/auto-csr-approver-29567194-pqwm6" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.356834 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfffw\" (UniqueName: \"kubernetes.io/projected/3e3f0313-6464-43d3-96df-ae1a675322b1-kube-api-access-vfffw\") pod \"auto-csr-approver-29567194-pqwm6\" (UID: \"3e3f0313-6464-43d3-96df-ae1a675322b1\") " pod="openshift-infra/auto-csr-approver-29567194-pqwm6" 
Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.421179 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b1aa290f-4335-4859-83e2-b2283b49e235/nova-cell1-conductor-conductor/0.log" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.422677 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5f8feaad-3661-4ea6-9e2d-90cf79d48df9/nova-api-api/0.log" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.460560 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0d61bbf6-923c-45e5-9e55-42cb69c00b3b/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.495508 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.879187 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f/nova-metadata-log/0.log" Mar 20 18:34:00 crc kubenswrapper[4690]: I0320 18:34:00.956710 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-pqwm6"] Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.291688 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a6c3a0b8-8793-4e94-bbee-851b32f0a393/nova-scheduler-scheduler/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.324622 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f/nova-metadata-metadata/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.358756 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d4aa597c-8302-463f-a383-39c9a51baa2c/mysql-bootstrap/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.369749 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-bs8n5_8146ff99-3308-4b91-b487-3bd707bed4dd/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.505763 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d4aa597c-8302-463f-a383-39c9a51baa2c/mysql-bootstrap/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.557513 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d4aa597c-8302-463f-a383-39c9a51baa2c/galera/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.583307 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" event={"ID":"3e3f0313-6464-43d3-96df-ae1a675322b1","Type":"ContainerStarted","Data":"c0fd53551f5e2bfb11ef06b5740fc2ad58d916e6a5b88f90cbd5c2c4b1dfe82a"} Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.659337 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dacc9bed-eaa9-4747-8a92-30f5afa0a698/mysql-bootstrap/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.774026 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dacc9bed-eaa9-4747-8a92-30f5afa0a698/mysql-bootstrap/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.774988 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_dacc9bed-eaa9-4747-8a92-30f5afa0a698/galera/0.log" Mar 20 18:34:01 crc kubenswrapper[4690]: I0320 18:34:01.921461 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a537f291-8787-434c-84bc-4355ccccbe47/openstackclient/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.102010 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-j8pr4_f0e78344-d5a9-4bc2-9556-e3daf0ce19db/ovn-controller/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.278118 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bv2wl_f8bc22b8-57e1-4cfd-bce8-446fb8cee600/openstack-network-exporter/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.349507 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8dmtk_497fed5f-b87a-4042-ae22-186983ed7536/ovsdb-server-init/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.539921 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8dmtk_497fed5f-b87a-4042-ae22-186983ed7536/ovs-vswitchd/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.569601 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8dmtk_497fed5f-b87a-4042-ae22-186983ed7536/ovsdb-server/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.580588 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8dmtk_497fed5f-b87a-4042-ae22-186983ed7536/ovsdb-server-init/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.794676 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6a18387d-9d4e-4fd5-bdb3-8568831a7930/openstack-network-exporter/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.853586 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-jhldh_64a253c9-3348-4ba3-9d9a-755348ebf561/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.872974 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6a18387d-9d4e-4fd5-bdb3-8568831a7930/ovn-northd/0.log" Mar 20 18:34:02 crc kubenswrapper[4690]: I0320 18:34:02.883039 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:34:02 crc kubenswrapper[4690]: E0320 18:34:02.883312 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.068828 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_172668db-85fb-47e1-82fe-dee7c454993e/openstack-network-exporter/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.092211 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_172668db-85fb-47e1-82fe-dee7c454993e/ovsdbserver-nb/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.254381 4690 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d64e7a22-5bb9-49ec-95d6-7ff145a31f9a/openstack-network-exporter/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.340627 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d64e7a22-5bb9-49ec-95d6-7ff145a31f9a/ovsdbserver-sb/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.442266 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77ff877fdd-nntbj_02713b3f-f042-40fc-a24e-f68ac876ae20/placement-api/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.564732 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77ff877fdd-nntbj_02713b3f-f042-40fc-a24e-f68ac876ae20/placement-log/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.607966 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" event={"ID":"3e3f0313-6464-43d3-96df-ae1a675322b1","Type":"ContainerStarted","Data":"d8ac455d756c20d2924c9548f608312f8a1e91a391cf9c76e9bfb32b96ff2d44"} Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.632409 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" podStartSLOduration=1.653108263 podStartE2EDuration="3.63239427s" podCreationTimestamp="2026-03-20 18:34:00 +0000 UTC" firstStartedPulling="2026-03-20 18:34:00.964931993 +0000 UTC m=+3715.830757671" lastFinishedPulling="2026-03-20 18:34:02.94421801 +0000 UTC m=+3717.810043678" observedRunningTime="2026-03-20 18:34:03.623078506 +0000 UTC m=+3718.488904184" watchObservedRunningTime="2026-03-20 18:34:03.63239427 +0000 UTC m=+3718.498219938" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.633082 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b93f0757-6c7a-473f-80e5-f4b9e7f88fad/setup-container/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.830962 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b93f0757-6c7a-473f-80e5-f4b9e7f88fad/setup-container/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.840224 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b93f0757-6c7a-473f-80e5-f4b9e7f88fad/rabbitmq/0.log" Mar 20 18:34:03 crc kubenswrapper[4690]: I0320 18:34:03.922146 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab528fee-94bb-4907-aca5-97dcabef8332/setup-container/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.123854 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab528fee-94bb-4907-aca5-97dcabef8332/setup-container/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.188845 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q_94cbf02f-b47c-44f2-ab14-bd01174bcc77/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.192054 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab528fee-94bb-4907-aca5-97dcabef8332/rabbitmq/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.401235 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-h7849_0fdbed5c-e2a7-42ee-9e92-68d0bbbff023/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.445925 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6_05a81786-36ff-4e8b-9bba-5e0ebbfc3247/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.616917 4690 generic.go:334] "Generic (PLEG): container finished" podID="3e3f0313-6464-43d3-96df-ae1a675322b1" containerID="d8ac455d756c20d2924c9548f608312f8a1e91a391cf9c76e9bfb32b96ff2d44" exitCode=0 Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.617198 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" event={"ID":"3e3f0313-6464-43d3-96df-ae1a675322b1","Type":"ContainerDied","Data":"d8ac455d756c20d2924c9548f608312f8a1e91a391cf9c76e9bfb32b96ff2d44"} Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.639682 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8zmzz_d7945514-9f35-4a0f-86f3-3d8e03a03d75/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.668096 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-fnwhc_7d323f18-a4a8-4074-8b3f-cafcb23bcd33/ssh-known-hosts-edpm-deployment/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.909195 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-799f9bd8b7-4q7w9_3f074183-2793-4719-95b3-c2df447c93ab/proxy-server/0.log" Mar 20 18:34:04 crc kubenswrapper[4690]: I0320 18:34:04.947796 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-799f9bd8b7-4q7w9_3f074183-2793-4719-95b3-c2df447c93ab/proxy-httpd/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.018508 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dsjgc_7930c325-4b03-450e-b3d0-b7116efc71cb/swift-ring-rebalance/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.167908 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/account-auditor/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.232354 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/account-reaper/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.281150 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/account-replicator/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.369555 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/account-server/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.456166 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/container-auditor/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.494778 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/container-server/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 
18:34:05.500703 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/container-replicator/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.587701 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/container-updater/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.692359 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-auditor/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.702737 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-expirer/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.720496 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-replicator/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.884173 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-server/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.941514 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-updater/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.970284 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/rsync/0.log" Mar 20 18:34:05 crc kubenswrapper[4690]: I0320 18:34:05.994563 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/swift-recon-cron/0.log" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.012489 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.140766 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfffw\" (UniqueName: \"kubernetes.io/projected/3e3f0313-6464-43d3-96df-ae1a675322b1-kube-api-access-vfffw\") pod \"3e3f0313-6464-43d3-96df-ae1a675322b1\" (UID: \"3e3f0313-6464-43d3-96df-ae1a675322b1\") " Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.149029 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3f0313-6464-43d3-96df-ae1a675322b1-kube-api-access-vfffw" (OuterVolumeSpecName: "kube-api-access-vfffw") pod "3e3f0313-6464-43d3-96df-ae1a675322b1" (UID: "3e3f0313-6464-43d3-96df-ae1a675322b1"). InnerVolumeSpecName "kube-api-access-vfffw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.242844 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfffw\" (UniqueName: \"kubernetes.io/projected/3e3f0313-6464-43d3-96df-ae1a675322b1-kube-api-access-vfffw\") on node \"crc\" DevicePath \"\"" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.243751 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_86a8f040-c0ab-4923-8bab-8123fd72e63e/tempest-tests-tempest-tests-runner/0.log" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.475964 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_af317ab4-ee88-4ad6-b2c8-02b26765f15f/test-operator-logs-container/0.log" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.581903 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qn629_0fb2f304-f772-4ce8-8372-177341555106/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.665756 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" event={"ID":"3e3f0313-6464-43d3-96df-ae1a675322b1","Type":"ContainerDied","Data":"c0fd53551f5e2bfb11ef06b5740fc2ad58d916e6a5b88f90cbd5c2c4b1dfe82a"} Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.666075 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0fd53551f5e2bfb11ef06b5740fc2ad58d916e6a5b88f90cbd5c2c4b1dfe82a" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.666213 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-pqwm6" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.686333 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-nszcx_3fa5c87e-a9cd-4046-9344-3a66c0c0977c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.699369 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-bh6s9"] Mar 20 18:34:06 crc kubenswrapper[4690]: I0320 18:34:06.710330 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-bh6s9"] Mar 20 18:34:07 crc kubenswrapper[4690]: I0320 18:34:07.900248 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15191f6-a6b7-458d-be7d-3387d83561d7" path="/var/lib/kubelet/pods/a15191f6-a6b7-458d-be7d-3387d83561d7/volumes" Mar 20 18:34:13 crc kubenswrapper[4690]: I0320 18:34:13.882905 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:34:13 crc kubenswrapper[4690]: E0320 18:34:13.883602 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:34:16 crc kubenswrapper[4690]: I0320 18:34:16.238817 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_b74da73d-632e-490b-b3c7-22450d29ede6/memcached/0.log" Mar 20 18:34:24 crc kubenswrapper[4690]: I0320 18:34:24.883067 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:34:24 crc kubenswrapper[4690]: E0320 18:34:24.883499 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:34:25 crc kubenswrapper[4690]: I0320 18:34:25.006545 4690 scope.go:117] "RemoveContainer" containerID="880b1625ae6ddbfbb89745ebf7924388f76f23475b2891410aa70c1e7c410b36" Mar 20 18:34:31 crc kubenswrapper[4690]: I0320 18:34:31.315556 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-5bf49_06fbcef9-d6fa-4dac-bfeb-93e3fc501f55/manager/0.log" Mar 20 18:34:31 crc kubenswrapper[4690]: I0320 18:34:31.492922 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/util/0.log" Mar 20 18:34:31 crc kubenswrapper[4690]: I0320 18:34:31.757941 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/pull/0.log" Mar 20 18:34:31 crc kubenswrapper[4690]: I0320 18:34:31.778521 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/pull/0.log" Mar 20 18:34:31 crc kubenswrapper[4690]: I0320 18:34:31.849442 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/util/0.log" Mar 20 18:34:32 crc kubenswrapper[4690]: I0320 18:34:32.537933 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/extract/0.log" Mar 20 18:34:32 crc kubenswrapper[4690]: I0320 18:34:32.690743 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-mbp48_a3db7a74-f9a7-4dfc-89a3-5727f538a3a7/manager/0.log" Mar 20 18:34:32 crc kubenswrapper[4690]: I0320 18:34:32.727274 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/util/0.log" Mar 20 18:34:32 crc kubenswrapper[4690]: I0320 18:34:32.750125 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/pull/0.log" Mar 20 18:34:32 crc kubenswrapper[4690]: I0320 18:34:32.927140 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-27xp7_1c5d887a-7a69-4f43-8b75-36de19325428/manager/0.log" Mar 20 18:34:33 crc 
kubenswrapper[4690]: I0320 18:34:33.043222 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-wclkd_6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1/manager/0.log" Mar 20 18:34:33 crc kubenswrapper[4690]: I0320 18:34:33.105299 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-rtb26_d71d628c-8060-418c-b0bf-f83193220e88/manager/0.log" Mar 20 18:34:33 crc kubenswrapper[4690]: I0320 18:34:33.235898 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-w2m9s_3871373d-0b43-4e90-84f8-01ee2e8e4159/manager/0.log" Mar 20 18:34:33 crc kubenswrapper[4690]: I0320 18:34:33.484566 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-mmwbh_09c39274-3aa3-4774-98e2-10f70b707a97/manager/0.log" Mar 20 18:34:33 crc kubenswrapper[4690]: I0320 18:34:33.529463 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-c55d6cc99-gzcjf_535eb2e4-3de8-49bd-97a8-135823a8d1c9/manager/0.log" Mar 20 18:34:33 crc kubenswrapper[4690]: I0320 18:34:33.666857 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-xgkr9_dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4/manager/0.log" Mar 20 18:34:33 crc kubenswrapper[4690]: I0320 18:34:33.685905 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-tllng_a8f81ddb-a5b3-4881-88de-66ed78d8d344/manager/0.log" Mar 20 18:34:33 crc kubenswrapper[4690]: I0320 18:34:33.893963 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-xs6lt_61746313-5249-48cd-8dae-f7984ba74f85/manager/0.log" Mar 20 18:34:33 crc kubenswrapper[4690]: I0320 18:34:33.937646 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-pxcl7_3efbe084-4e50-405e-b477-b3b87635d465/manager/0.log" Mar 20 18:34:34 crc kubenswrapper[4690]: I0320 18:34:34.146991 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-xzzx7_a31564d4-ce19-4893-bde8-871ced7c077b/manager/0.log" Mar 20 18:34:34 crc kubenswrapper[4690]: I0320 18:34:34.147448 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-55787_458fe699-42d5-44ad-9288-3b6fbcd87161/manager/0.log" Mar 20 18:34:34 crc kubenswrapper[4690]: I0320 18:34:34.323960 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f5m7n5b_f3bca6f7-be2b-4420-8664-b94ba53d5f7f/manager/0.log" Mar 20 18:34:34 crc kubenswrapper[4690]: I0320 18:34:34.395645 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-77cd8cbff5-64vnn_2e2d986c-6f20-4436-a367-df98a71f79f0/operator/0.log" Mar 20 18:34:34 crc kubenswrapper[4690]: I0320 18:34:34.558566 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xcb58_595a25b2-d477-4ec7-b9ad-8eb670c2ea3f/registry-server/0.log" Mar 20 18:34:34 crc kubenswrapper[4690]: I0320 
18:34:34.911945 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-48tn7_3730dc8b-cf83-4f29-ac0b-3776ef3efeba/manager/0.log" Mar 20 18:34:34 crc kubenswrapper[4690]: I0320 18:34:34.998167 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-fm55z_a27fabe1-095d-4c34-8e91-862aa1dbf964/manager/0.log" Mar 20 18:34:35 crc kubenswrapper[4690]: I0320 18:34:35.218312 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-klmr6_64a2959f-0b79-4b19-934b-486aad0e782b/manager/0.log" Mar 20 18:34:35 crc kubenswrapper[4690]: I0320 18:34:35.433153 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-8kl25_c7a8e424-00f3-4e97-b6b3-bd2513624b2e/manager/0.log" Mar 20 18:34:35 crc kubenswrapper[4690]: I0320 18:34:35.470818 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-vnl4s_d7d5bc08-99d0-4361-ae0e-ca9732db6154/manager/0.log" Mar 20 18:34:35 crc kubenswrapper[4690]: I0320 18:34:35.649733 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-pv7x5_64ced890-6363-43c3-83e5-0001c72851ef/manager/0.log" Mar 20 18:34:35 crc kubenswrapper[4690]: I0320 18:34:35.708342 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54c9b8654f-ms4r7_26b1c9fc-55f9-4895-9d23-a7c7e0e811c3/manager/0.log" Mar 20 18:34:38 crc kubenswrapper[4690]: I0320 18:34:38.883038 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:34:38 crc kubenswrapper[4690]: E0320 18:34:38.883635 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:34:50 crc kubenswrapper[4690]: I0320 18:34:50.883024 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:34:50 crc kubenswrapper[4690]: E0320 18:34:50.883674 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:34:55 crc kubenswrapper[4690]: I0320 18:34:55.430191 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kkhg7_dc26c755-5e1b-480b-b3ed-b3d3dee36d94/control-plane-machine-set-operator/0.log" Mar 20 18:34:55 crc kubenswrapper[4690]: I0320 18:34:55.610233 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9n98c_bb444275-6cc1-42be-b742-afc344a60995/kube-rbac-proxy/0.log" Mar 20 18:34:55 crc kubenswrapper[4690]: I0320 18:34:55.645277 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9n98c_bb444275-6cc1-42be-b742-afc344a60995/machine-api-operator/0.log" Mar 20 18:35:05 crc kubenswrapper[4690]: I0320 18:35:05.895835 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:35:05 crc kubenswrapper[4690]: E0320 18:35:05.900733 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:35:08 crc kubenswrapper[4690]: I0320 18:35:08.230798 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-sj4vr_7f7c4ed7-ab53-40e9-8977-77afd116ce1b/cert-manager-controller/0.log" Mar 20 18:35:08 crc kubenswrapper[4690]: I0320 18:35:08.447984 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-5lwhr_2040aed1-0ccc-4068-8e68-5ddda58ddd5e/cert-manager-cainjector/0.log" Mar 20 18:35:08 crc kubenswrapper[4690]: I0320 18:35:08.475337 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-s9shq_46b55360-cf52-4b63-90e4-b578b7181c19/cert-manager-webhook/0.log" Mar 20 18:35:16 crc kubenswrapper[4690]: I0320 18:35:16.882947 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:35:16 crc kubenswrapper[4690]: E0320 18:35:16.884581 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:35:20 crc kubenswrapper[4690]: I0320 18:35:20.486958 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-66twl_bab1f41a-57aa-43a8-b690-62eb634c99dc/nmstate-console-plugin/0.log" Mar 20 18:35:20 crc kubenswrapper[4690]: I0320 18:35:20.688681 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-h7mj5_15577f5b-3df3-4e8e-bebe-6abe5379debf/nmstate-handler/0.log" Mar 20 18:35:20 crc kubenswrapper[4690]: I0320 18:35:20.721118 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zpndn_2a8c0e04-bbfb-46b1-8562-2a1697b85035/kube-rbac-proxy/0.log" Mar 20 18:35:20 crc kubenswrapper[4690]: I0320 18:35:20.792622 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zpndn_2a8c0e04-bbfb-46b1-8562-2a1697b85035/nmstate-metrics/0.log" Mar 20 18:35:20 crc kubenswrapper[4690]: I0320 18:35:20.887774 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-6th4x_7e787826-7e6f-4ac9-856e-73304533640d/nmstate-operator/0.log" Mar 20 18:35:21 crc kubenswrapper[4690]: I0320 18:35:21.016081 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-2gftm_943c74c5-b182-46da-9ea4-164a4eb553d0/nmstate-webhook/0.log" Mar 20 18:35:30 crc kubenswrapper[4690]: I0320 18:35:30.883839 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:35:30 crc kubenswrapper[4690]: E0320 18:35:30.884694 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:35:45 crc kubenswrapper[4690]: I0320 18:35:45.890041 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:35:45 crc kubenswrapper[4690]: E0320 18:35:45.891171 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.254377 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jbfw7_fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e/kube-rbac-proxy/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.275313 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jbfw7_fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e/controller/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.411044 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-frr-files/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.619357 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-frr-files/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.646319 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-reloader/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.649307 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-reloader/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.678758 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-metrics/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.849249 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-frr-files/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.873086 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-metrics/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.879003 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-reloader/0.log" Mar 20 18:35:48 crc kubenswrapper[4690]: I0320 18:35:48.960399 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-metrics/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.074167 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-metrics/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.078453 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-frr-files/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.082390 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-reloader/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.154234 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/controller/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.285785 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/kube-rbac-proxy/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.313635 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/frr-metrics/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.345689 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/kube-rbac-proxy-frr/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.518512 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/reloader/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.542934 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-fn9hk_60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0/frr-k8s-webhook-server/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.764527 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-77445cdfc6-46r4h_fcfa35ef-e556-4c2e-a742-84265930366f/manager/0.log" Mar 20 18:35:49 crc kubenswrapper[4690]: I0320 18:35:49.915764 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b96b44647-l5zw2_7a5a20c4-0745-41d2-a2ff-f389423513b2/webhook-server/0.log" Mar 20 18:35:50 crc kubenswrapper[4690]: I0320 18:35:50.041988 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cfggn_9e0ecbbf-1e0c-408a-b58c-07cd90497c39/kube-rbac-proxy/0.log" Mar 20 18:35:50 crc kubenswrapper[4690]: I0320 18:35:50.573872 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cfggn_9e0ecbbf-1e0c-408a-b58c-07cd90497c39/speaker/0.log" Mar 20 18:35:50 crc kubenswrapper[4690]: I0320 18:35:50.825211 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/frr/0.log" Mar 20 18:35:57 crc kubenswrapper[4690]: I0320 18:35:57.883910 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:35:57 crc kubenswrapper[4690]: E0320 18:35:57.884588 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.154903 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567196-fwszr"] Mar 20 18:36:00 crc kubenswrapper[4690]: E0320 18:36:00.155852 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3f0313-6464-43d3-96df-ae1a675322b1" containerName="oc" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.155867 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3f0313-6464-43d3-96df-ae1a675322b1" containerName="oc" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.156046 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3f0313-6464-43d3-96df-ae1a675322b1" containerName="oc" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.156732 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-fwszr" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.158770 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.159118 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.159600 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.165388 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-fwszr"] Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.344029 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbs5\" (UniqueName: \"kubernetes.io/projected/8be67ef1-def0-48f6-8766-80d72249d2d5-kube-api-access-ffbs5\") pod \"auto-csr-approver-29567196-fwszr\" (UID: \"8be67ef1-def0-48f6-8766-80d72249d2d5\") " pod="openshift-infra/auto-csr-approver-29567196-fwszr" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.445666 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbs5\" (UniqueName: \"kubernetes.io/projected/8be67ef1-def0-48f6-8766-80d72249d2d5-kube-api-access-ffbs5\") pod \"auto-csr-approver-29567196-fwszr\" (UID: \"8be67ef1-def0-48f6-8766-80d72249d2d5\") " pod="openshift-infra/auto-csr-approver-29567196-fwszr" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.470477 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbs5\" (UniqueName: \"kubernetes.io/projected/8be67ef1-def0-48f6-8766-80d72249d2d5-kube-api-access-ffbs5\") pod 
\"auto-csr-approver-29567196-fwszr\" (UID: \"8be67ef1-def0-48f6-8766-80d72249d2d5\") " pod="openshift-infra/auto-csr-approver-29567196-fwszr" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.475408 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-fwszr" Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.927135 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-fwszr"] Mar 20 18:36:00 crc kubenswrapper[4690]: I0320 18:36:00.943870 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:36:01 crc kubenswrapper[4690]: I0320 18:36:01.707088 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-fwszr" event={"ID":"8be67ef1-def0-48f6-8766-80d72249d2d5","Type":"ContainerStarted","Data":"32bb459b4cc64cc2bb211795c1c4e3d1424adc736626ff30b30d137ed99b8eb2"} Mar 20 18:36:02 crc kubenswrapper[4690]: I0320 18:36:02.724984 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-fwszr" event={"ID":"8be67ef1-def0-48f6-8766-80d72249d2d5","Type":"ContainerStarted","Data":"a0149ee86bf7e7295aac586808e93729dd144cea23e38a82dbb01e8e53c0e5cf"} Mar 20 18:36:02 crc kubenswrapper[4690]: I0320 18:36:02.748683 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567196-fwszr" podStartSLOduration=1.509134765 podStartE2EDuration="2.748650808s" podCreationTimestamp="2026-03-20 18:36:00 +0000 UTC" firstStartedPulling="2026-03-20 18:36:00.943607219 +0000 UTC m=+3835.809432897" lastFinishedPulling="2026-03-20 18:36:02.183123222 +0000 UTC m=+3837.048948940" observedRunningTime="2026-03-20 18:36:02.74484597 +0000 UTC m=+3837.610671668" watchObservedRunningTime="2026-03-20 18:36:02.748650808 +0000 UTC m=+3837.614476506" Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.392905 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/util/0.log" Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.539970 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/util/0.log" Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.591667 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/pull/0.log" Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.630906 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/pull/0.log" Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.734774 4690 generic.go:334] "Generic (PLEG): container finished" podID="8be67ef1-def0-48f6-8766-80d72249d2d5" containerID="a0149ee86bf7e7295aac586808e93729dd144cea23e38a82dbb01e8e53c0e5cf" exitCode=0 Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.734830 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-fwszr" 
event={"ID":"8be67ef1-def0-48f6-8766-80d72249d2d5","Type":"ContainerDied","Data":"a0149ee86bf7e7295aac586808e93729dd144cea23e38a82dbb01e8e53c0e5cf"} Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.737871 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/util/0.log" Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.775170 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/pull/0.log" Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.781196 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/extract/0.log" Mar 20 18:36:03 crc kubenswrapper[4690]: I0320 18:36:03.930831 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/util/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.100646 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/pull/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.111061 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/util/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.124964 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/pull/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.273400 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/extract/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.304775 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/util/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.320049 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/pull/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.454725 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-utilities/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.597427 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-utilities/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.598945 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-content/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.605024 4690 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-content/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.783872 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-utilities/0.log" Mar 20 18:36:04 crc kubenswrapper[4690]: I0320 18:36:04.806760 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-content/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.015476 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-utilities/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.116323 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-fwszr" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.242923 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffbs5\" (UniqueName: \"kubernetes.io/projected/8be67ef1-def0-48f6-8766-80d72249d2d5-kube-api-access-ffbs5\") pod \"8be67ef1-def0-48f6-8766-80d72249d2d5\" (UID: \"8be67ef1-def0-48f6-8766-80d72249d2d5\") " Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.250423 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be67ef1-def0-48f6-8766-80d72249d2d5-kube-api-access-ffbs5" (OuterVolumeSpecName: "kube-api-access-ffbs5") pod "8be67ef1-def0-48f6-8766-80d72249d2d5" (UID: "8be67ef1-def0-48f6-8766-80d72249d2d5"). InnerVolumeSpecName "kube-api-access-ffbs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.266628 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-utilities/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.295041 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/registry-server/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.345642 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffbs5\" (UniqueName: \"kubernetes.io/projected/8be67ef1-def0-48f6-8766-80d72249d2d5-kube-api-access-ffbs5\") on node \"crc\" DevicePath \"\"" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.353363 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-content/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.356433 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-content/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.480216 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-utilities/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.489738 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-content/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.678266 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7hpm8_23f72eed-c3c0-4aed-a4a8-8243c27a2785/marketplace-operator/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.756216 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-fwszr" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.756287 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-fwszr" event={"ID":"8be67ef1-def0-48f6-8766-80d72249d2d5","Type":"ContainerDied","Data":"32bb459b4cc64cc2bb211795c1c4e3d1424adc736626ff30b30d137ed99b8eb2"} Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.756324 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32bb459b4cc64cc2bb211795c1c4e3d1424adc736626ff30b30d137ed99b8eb2" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.817028 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-vwnbr"] Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.820782 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-utilities/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.827475 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-vwnbr"] Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.895302 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d26f710-83a3-42e2-861e-46f27ab271df" path="/var/lib/kubelet/pods/9d26f710-83a3-42e2-861e-46f27ab271df/volumes" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.992436 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/registry-server/0.log" Mar 20 18:36:05 crc kubenswrapper[4690]: I0320 18:36:05.997910 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-utilities/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.030024 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-content/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.054496 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-content/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.230149 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-content/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.317430 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-utilities/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.355406 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/registry-server/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.461573 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-utilities/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.644069 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-content/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.652550 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-utilities/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.660551 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-content/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.813771 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-content/0.log" Mar 20 18:36:06 crc kubenswrapper[4690]: I0320 18:36:06.853809 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-utilities/0.log" Mar 20 18:36:07 crc kubenswrapper[4690]: I0320 18:36:07.275214 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/registry-server/0.log" Mar 20 18:36:11 crc kubenswrapper[4690]: I0320 18:36:11.886572 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:36:11 crc kubenswrapper[4690]: E0320 18:36:11.887290 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:36:22 crc kubenswrapper[4690]: I0320 18:36:22.883052 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:36:22 crc kubenswrapper[4690]: E0320 18:36:22.883800 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:36:25 crc kubenswrapper[4690]: I0320 18:36:25.143464 4690 scope.go:117] "RemoveContainer" containerID="ba3b7a58688c9ecd6b2365531a7406696328ac397be70982f36f713b04e3952f" Mar 20 18:36:31 crc kubenswrapper[4690]: E0320 18:36:31.855229 4690 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.192:33138->38.102.83.192:45043: write tcp 38.102.83.192:33138->38.102.83.192:45043: write: broken pipe Mar 20 18:36:34 crc kubenswrapper[4690]: I0320 18:36:34.882983 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:36:34 crc kubenswrapper[4690]: E0320 18:36:34.883701 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:36:48 crc kubenswrapper[4690]: I0320 18:36:48.883712 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:36:48 crc kubenswrapper[4690]: E0320 18:36:48.885753 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:37:00 crc kubenswrapper[4690]: I0320 18:37:00.883679 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:37:00 crc kubenswrapper[4690]: E0320 18:37:00.884981 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:37:13 crc kubenswrapper[4690]: I0320 18:37:13.889825 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:37:13 crc kubenswrapper[4690]: E0320 18:37:13.892504 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:37:25 crc kubenswrapper[4690]: I0320 18:37:25.902636 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:37:25 crc kubenswrapper[4690]: E0320 18:37:25.903529 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:37:37 crc kubenswrapper[4690]: I0320 18:37:37.885537 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:37:37 crc kubenswrapper[4690]: E0320 18:37:37.889328 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" 
podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:37:50 crc kubenswrapper[4690]: I0320 18:37:50.884950 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:37:50 crc kubenswrapper[4690]: E0320 18:37:50.886148 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:37:56 crc kubenswrapper[4690]: I0320 18:37:56.877518 4690 generic.go:334] "Generic (PLEG): container finished" podID="8f57ca90-28d1-4064-a387-af2a1bf69731" containerID="06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec" exitCode=0 Mar 20 18:37:56 crc kubenswrapper[4690]: I0320 18:37:56.877643 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwxb9/must-gather-t7qds" event={"ID":"8f57ca90-28d1-4064-a387-af2a1bf69731","Type":"ContainerDied","Data":"06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec"} Mar 20 18:37:56 crc kubenswrapper[4690]: I0320 18:37:56.879145 4690 scope.go:117] "RemoveContainer" containerID="06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec" Mar 20 18:37:57 crc kubenswrapper[4690]: I0320 18:37:57.182230 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pwxb9_must-gather-t7qds_8f57ca90-28d1-4064-a387-af2a1bf69731/gather/0.log" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.157969 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567198-c9q5m"] Mar 20 18:38:00 crc kubenswrapper[4690]: E0320 18:38:00.158872 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be67ef1-def0-48f6-8766-80d72249d2d5" containerName="oc" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.158885 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be67ef1-def0-48f6-8766-80d72249d2d5" containerName="oc" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.159066 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be67ef1-def0-48f6-8766-80d72249d2d5" containerName="oc" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.159696 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-c9q5m" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.166590 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.166699 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.166934 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.167890 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-c9q5m"] Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.293250 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vpgw\" (UniqueName: \"kubernetes.io/projected/973021e2-d5d7-48a9-b0e1-487008ee4009-kube-api-access-5vpgw\") pod \"auto-csr-approver-29567198-c9q5m\" (UID: \"973021e2-d5d7-48a9-b0e1-487008ee4009\") " pod="openshift-infra/auto-csr-approver-29567198-c9q5m" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.394529 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vpgw\" (UniqueName: \"kubernetes.io/projected/973021e2-d5d7-48a9-b0e1-487008ee4009-kube-api-access-5vpgw\") pod \"auto-csr-approver-29567198-c9q5m\" (UID: \"973021e2-d5d7-48a9-b0e1-487008ee4009\") " pod="openshift-infra/auto-csr-approver-29567198-c9q5m" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.412442 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vpgw\" (UniqueName: \"kubernetes.io/projected/973021e2-d5d7-48a9-b0e1-487008ee4009-kube-api-access-5vpgw\") pod \"auto-csr-approver-29567198-c9q5m\" (UID: \"973021e2-d5d7-48a9-b0e1-487008ee4009\") " pod="openshift-infra/auto-csr-approver-29567198-c9q5m" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.494606 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-c9q5m" Mar 20 18:38:00 crc kubenswrapper[4690]: I0320 18:38:00.922595 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-c9q5m"] Mar 20 18:38:00 crc kubenswrapper[4690]: W0320 18:38:00.932582 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod973021e2_d5d7_48a9_b0e1_487008ee4009.slice/crio-d91f33c9ab1c251898d53cf4b5c933e5c78523039753b0b558350fe571ea1c30 WatchSource:0}: Error finding container d91f33c9ab1c251898d53cf4b5c933e5c78523039753b0b558350fe571ea1c30: Status 404 returned error can't find the container with id d91f33c9ab1c251898d53cf4b5c933e5c78523039753b0b558350fe571ea1c30 Mar 20 18:38:01 crc kubenswrapper[4690]: I0320 18:38:01.938317 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-c9q5m" event={"ID":"973021e2-d5d7-48a9-b0e1-487008ee4009","Type":"ContainerStarted","Data":"d91f33c9ab1c251898d53cf4b5c933e5c78523039753b0b558350fe571ea1c30"} Mar 20 18:38:02 crc kubenswrapper[4690]: I0320 18:38:02.949942 4690 generic.go:334] "Generic (PLEG): container finished" podID="973021e2-d5d7-48a9-b0e1-487008ee4009" containerID="3a1c7a9f9da72a541c12435448886bf8fa98a757aa213cbb2267650c5ceaef50" exitCode=0 Mar 20 18:38:02 crc kubenswrapper[4690]: I0320 18:38:02.950038 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-c9q5m" event={"ID":"973021e2-d5d7-48a9-b0e1-487008ee4009","Type":"ContainerDied","Data":"3a1c7a9f9da72a541c12435448886bf8fa98a757aa213cbb2267650c5ceaef50"} Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.395114 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-c9q5m" Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.487141 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vpgw\" (UniqueName: \"kubernetes.io/projected/973021e2-d5d7-48a9-b0e1-487008ee4009-kube-api-access-5vpgw\") pod \"973021e2-d5d7-48a9-b0e1-487008ee4009\" (UID: \"973021e2-d5d7-48a9-b0e1-487008ee4009\") " Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.494536 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973021e2-d5d7-48a9-b0e1-487008ee4009-kube-api-access-5vpgw" (OuterVolumeSpecName: "kube-api-access-5vpgw") pod "973021e2-d5d7-48a9-b0e1-487008ee4009" (UID: "973021e2-d5d7-48a9-b0e1-487008ee4009"). InnerVolumeSpecName "kube-api-access-5vpgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.589108 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vpgw\" (UniqueName: \"kubernetes.io/projected/973021e2-d5d7-48a9-b0e1-487008ee4009-kube-api-access-5vpgw\") on node \"crc\" DevicePath \"\"" Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.966870 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-c9q5m" event={"ID":"973021e2-d5d7-48a9-b0e1-487008ee4009","Type":"ContainerDied","Data":"d91f33c9ab1c251898d53cf4b5c933e5c78523039753b0b558350fe571ea1c30"} Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.966912 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d91f33c9ab1c251898d53cf4b5c933e5c78523039753b0b558350fe571ea1c30" Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.966911 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-c9q5m" Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.969796 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pwxb9/must-gather-t7qds"] Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.970000 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pwxb9/must-gather-t7qds" podUID="8f57ca90-28d1-4064-a387-af2a1bf69731" containerName="copy" containerID="cri-o://de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626" gracePeriod=2 Mar 20 18:38:04 crc kubenswrapper[4690]: I0320 18:38:04.980854 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pwxb9/must-gather-t7qds"] Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.374158 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pwxb9_must-gather-t7qds_8f57ca90-28d1-4064-a387-af2a1bf69731/copy/0.log" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.374671 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.466327 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-lsb4v"] Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.475214 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-lsb4v"] Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.507966 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph65n\" (UniqueName: \"kubernetes.io/projected/8f57ca90-28d1-4064-a387-af2a1bf69731-kube-api-access-ph65n\") pod \"8f57ca90-28d1-4064-a387-af2a1bf69731\" (UID: \"8f57ca90-28d1-4064-a387-af2a1bf69731\") " Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.508013 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f57ca90-28d1-4064-a387-af2a1bf69731-must-gather-output\") pod \"8f57ca90-28d1-4064-a387-af2a1bf69731\" (UID: \"8f57ca90-28d1-4064-a387-af2a1bf69731\") " Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.513152 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f57ca90-28d1-4064-a387-af2a1bf69731-kube-api-access-ph65n" (OuterVolumeSpecName: "kube-api-access-ph65n") pod "8f57ca90-28d1-4064-a387-af2a1bf69731" (UID: "8f57ca90-28d1-4064-a387-af2a1bf69731"). InnerVolumeSpecName "kube-api-access-ph65n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.610413 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph65n\" (UniqueName: \"kubernetes.io/projected/8f57ca90-28d1-4064-a387-af2a1bf69731-kube-api-access-ph65n\") on node \"crc\" DevicePath \"\"" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.672488 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f57ca90-28d1-4064-a387-af2a1bf69731-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8f57ca90-28d1-4064-a387-af2a1bf69731" (UID: "8f57ca90-28d1-4064-a387-af2a1bf69731"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.712843 4690 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f57ca90-28d1-4064-a387-af2a1bf69731-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.891696 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:38:05 crc kubenswrapper[4690]: E0320 18:38:05.892482 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.895000 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fad462b-be0f-40f9-bc9c-8ae5aec84e6a" path="/var/lib/kubelet/pods/2fad462b-be0f-40f9-bc9c-8ae5aec84e6a/volumes" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.895776 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f57ca90-28d1-4064-a387-af2a1bf69731" path="/var/lib/kubelet/pods/8f57ca90-28d1-4064-a387-af2a1bf69731/volumes" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.978772 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pwxb9_must-gather-t7qds_8f57ca90-28d1-4064-a387-af2a1bf69731/copy/0.log" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.979085 4690 generic.go:334] "Generic (PLEG): container finished" podID="8f57ca90-28d1-4064-a387-af2a1bf69731" containerID="de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626" exitCode=143 Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.979133 4690 scope.go:117] "RemoveContainer" containerID="de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626" Mar 20 18:38:05 crc kubenswrapper[4690]: I0320 18:38:05.979271 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwxb9/must-gather-t7qds" Mar 20 18:38:06 crc kubenswrapper[4690]: I0320 18:38:06.000330 4690 scope.go:117] "RemoveContainer" containerID="06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec" Mar 20 18:38:06 crc kubenswrapper[4690]: I0320 18:38:06.065707 4690 scope.go:117] "RemoveContainer" containerID="de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626" Mar 20 18:38:06 crc kubenswrapper[4690]: E0320 18:38:06.066191 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626\": container with ID starting with de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626 not found: ID does not exist" containerID="de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626" Mar 20 18:38:06 crc kubenswrapper[4690]: I0320 18:38:06.066281 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626"} err="failed to get container status \"de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626\": rpc error: code = NotFound desc = could not find container \"de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626\": container with ID starting with de38e64b0939fef49f9fae9f8fd3b6e77b76ff109ff075ac69b805eaebdf5626 not found: ID does not exist" Mar 20 18:38:06 crc kubenswrapper[4690]: I0320 18:38:06.066858 4690 scope.go:117] "RemoveContainer" containerID="06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec" Mar 20 18:38:06 crc kubenswrapper[4690]: E0320 18:38:06.067395 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec\": container with ID starting with 06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec not found: ID does not exist" containerID="06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec" Mar 20 18:38:06 crc kubenswrapper[4690]: I0320 18:38:06.067436 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec"} err="failed to get container status \"06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec\": rpc error: code = NotFound desc = could not find container \"06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec\": container with ID starting with 06f866b63ac88dc8b55e1d03ee1bb4389be06869dddc2dfa4ff495f46d9283ec not found: ID does not exist" Mar 20 18:38:12 crc kubenswrapper[4690]: E0320 18:38:12.537302 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f57ca90_28d1_4064_a387_af2a1bf69731.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:38:16 crc kubenswrapper[4690]: I0320 18:38:16.883574 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:38:16 crc kubenswrapper[4690]: E0320 18:38:16.884409 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:38:22 crc kubenswrapper[4690]: E0320 18:38:22.765316 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f57ca90_28d1_4064_a387_af2a1bf69731.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:38:25 crc kubenswrapper[4690]: I0320 18:38:25.279049 4690 scope.go:117] "RemoveContainer" containerID="0d672a3fb53e5ac2689ddf8b7c559f859e690a26fc918689bdf48f3925e81dde" Mar 20 18:38:31 crc kubenswrapper[4690]: I0320 18:38:31.883798 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:38:32 crc kubenswrapper[4690]: I0320 18:38:32.249962 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"71d102b5805b8fe7f4d59a06973e7c063c629bb25d542cdb23626c14c624bec3"} Mar 20 18:38:33 crc kubenswrapper[4690]: E0320 18:38:33.027581 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f57ca90_28d1_4064_a387_af2a1bf69731.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:38:43 crc kubenswrapper[4690]: E0320 18:38:43.374825 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f57ca90_28d1_4064_a387_af2a1bf69731.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:38:53 crc kubenswrapper[4690]: E0320 18:38:53.639766 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f57ca90_28d1_4064_a387_af2a1bf69731.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.847665 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6sx89"] Mar 20 18:39:00 crc kubenswrapper[4690]: E0320 18:39:00.848407 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f57ca90-28d1-4064-a387-af2a1bf69731" containerName="copy" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.848418 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f57ca90-28d1-4064-a387-af2a1bf69731" containerName="copy" Mar 20 18:39:00 crc kubenswrapper[4690]: E0320 18:39:00.848441 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973021e2-d5d7-48a9-b0e1-487008ee4009" containerName="oc" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.848446 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="973021e2-d5d7-48a9-b0e1-487008ee4009" containerName="oc" Mar 20 18:39:00 crc kubenswrapper[4690]: E0320 18:39:00.848457 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f57ca90-28d1-4064-a387-af2a1bf69731" containerName="gather" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.848463 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f57ca90-28d1-4064-a387-af2a1bf69731" 
containerName="gather" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.848624 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="973021e2-d5d7-48a9-b0e1-487008ee4009" containerName="oc" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.848635 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f57ca90-28d1-4064-a387-af2a1bf69731" containerName="copy" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.848649 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f57ca90-28d1-4064-a387-af2a1bf69731" containerName="gather" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.849911 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.870692 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6sx89"] Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.952523 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js6sn\" (UniqueName: \"kubernetes.io/projected/22fd45db-f26a-49cb-be67-edb6178d00b9-kube-api-access-js6sn\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.952592 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-utilities\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:00 crc kubenswrapper[4690]: I0320 18:39:00.953033 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-catalog-content\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:01 crc kubenswrapper[4690]: I0320 18:39:01.055549 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js6sn\" (UniqueName: \"kubernetes.io/projected/22fd45db-f26a-49cb-be67-edb6178d00b9-kube-api-access-js6sn\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:01 crc kubenswrapper[4690]: I0320 18:39:01.055900 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-utilities\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:01 crc kubenswrapper[4690]: I0320 18:39:01.056192 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-catalog-content\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:01 crc kubenswrapper[4690]: I0320 18:39:01.056449 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-utilities\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:01 crc kubenswrapper[4690]: I0320 18:39:01.056977 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-catalog-content\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:01 crc kubenswrapper[4690]: I0320 18:39:01.086135 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js6sn\" (UniqueName: \"kubernetes.io/projected/22fd45db-f26a-49cb-be67-edb6178d00b9-kube-api-access-js6sn\") pod \"community-operators-6sx89\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:01 crc kubenswrapper[4690]: I0320 18:39:01.171453 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:01 crc kubenswrapper[4690]: I0320 18:39:01.695294 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6sx89"] Mar 20 18:39:02 crc kubenswrapper[4690]: I0320 18:39:02.557814 4690 generic.go:334] "Generic (PLEG): container finished" podID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerID="647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760" exitCode=0 Mar 20 18:39:02 crc kubenswrapper[4690]: I0320 18:39:02.557914 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6sx89" event={"ID":"22fd45db-f26a-49cb-be67-edb6178d00b9","Type":"ContainerDied","Data":"647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760"} Mar 20 18:39:02 crc kubenswrapper[4690]: I0320 18:39:02.558074 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6sx89" event={"ID":"22fd45db-f26a-49cb-be67-edb6178d00b9","Type":"ContainerStarted","Data":"c152a739562357fe9265b6309216f06bf69eb5370ff3cd311e2d44a33340d453"} Mar 20 18:39:03 crc kubenswrapper[4690]: I0320 18:39:03.573312 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6sx89" event={"ID":"22fd45db-f26a-49cb-be67-edb6178d00b9","Type":"ContainerStarted","Data":"62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc"} Mar 20 18:39:03 crc kubenswrapper[4690]: E0320 18:39:03.863610 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f57ca90_28d1_4064_a387_af2a1bf69731.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:39:04 crc kubenswrapper[4690]: I0320 18:39:04.585838 4690 generic.go:334] "Generic (PLEG): container finished" podID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerID="62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc" exitCode=0 Mar 20 18:39:04 crc kubenswrapper[4690]: I0320 18:39:04.585943 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6sx89" event={"ID":"22fd45db-f26a-49cb-be67-edb6178d00b9","Type":"ContainerDied","Data":"62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc"} Mar 20 18:39:05 crc 
kubenswrapper[4690]: I0320 18:39:05.599843 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6sx89" event={"ID":"22fd45db-f26a-49cb-be67-edb6178d00b9","Type":"ContainerStarted","Data":"2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213"} Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.622487 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6sx89" podStartSLOduration=3.104528012 podStartE2EDuration="5.622465021s" podCreationTimestamp="2026-03-20 18:39:00 +0000 UTC" firstStartedPulling="2026-03-20 18:39:02.559914637 +0000 UTC m=+4017.425740315" lastFinishedPulling="2026-03-20 18:39:05.077851636 +0000 UTC m=+4019.943677324" observedRunningTime="2026-03-20 18:39:05.616869702 +0000 UTC m=+4020.482695390" watchObservedRunningTime="2026-03-20 18:39:05.622465021 +0000 UTC m=+4020.488290699" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.642436 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nl8tc"] Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.644424 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.651190 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-utilities\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.651451 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kgsw\" (UniqueName: \"kubernetes.io/projected/946b16f4-d454-4b40-b2c5-453f8bb52651-kube-api-access-5kgsw\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.651595 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-catalog-content\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.656152 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nl8tc"] Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.753715 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kgsw\" (UniqueName: \"kubernetes.io/projected/946b16f4-d454-4b40-b2c5-453f8bb52651-kube-api-access-5kgsw\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.753795 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-catalog-content\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc 
kubenswrapper[4690]: I0320 18:39:05.753899 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-utilities\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.754218 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-catalog-content\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.754521 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-utilities\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.778413 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kgsw\" (UniqueName: \"kubernetes.io/projected/946b16f4-d454-4b40-b2c5-453f8bb52651-kube-api-access-5kgsw\") pod \"certified-operators-nl8tc\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:05 crc kubenswrapper[4690]: I0320 18:39:05.965645 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:06 crc kubenswrapper[4690]: I0320 18:39:06.471033 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nl8tc"] Mar 20 18:39:06 crc kubenswrapper[4690]: W0320 18:39:06.471479 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod946b16f4_d454_4b40_b2c5_453f8bb52651.slice/crio-5a2db0ba9a1b1dc85ae64b3a56a963dd38d17aafd132fa78db15d91dc4928b35 WatchSource:0}: Error finding container 5a2db0ba9a1b1dc85ae64b3a56a963dd38d17aafd132fa78db15d91dc4928b35: Status 404 returned error can't find the container with id 5a2db0ba9a1b1dc85ae64b3a56a963dd38d17aafd132fa78db15d91dc4928b35 Mar 20 18:39:06 crc kubenswrapper[4690]: I0320 18:39:06.614208 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8tc" event={"ID":"946b16f4-d454-4b40-b2c5-453f8bb52651","Type":"ContainerStarted","Data":"5a2db0ba9a1b1dc85ae64b3a56a963dd38d17aafd132fa78db15d91dc4928b35"} Mar 20 18:39:07 crc kubenswrapper[4690]: I0320 18:39:07.622354 4690 generic.go:334] "Generic (PLEG): container finished" podID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerID="9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37" exitCode=0 Mar 20 18:39:07 crc kubenswrapper[4690]: I0320 18:39:07.622481 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8tc" event={"ID":"946b16f4-d454-4b40-b2c5-453f8bb52651","Type":"ContainerDied","Data":"9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37"} Mar 20 18:39:08 crc kubenswrapper[4690]: I0320 18:39:08.632329 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8tc" 
event={"ID":"946b16f4-d454-4b40-b2c5-453f8bb52651","Type":"ContainerStarted","Data":"5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f"} Mar 20 18:39:10 crc kubenswrapper[4690]: I0320 18:39:10.659404 4690 generic.go:334] "Generic (PLEG): container finished" podID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerID="5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f" exitCode=0 Mar 20 18:39:10 crc kubenswrapper[4690]: I0320 18:39:10.659503 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8tc" event={"ID":"946b16f4-d454-4b40-b2c5-453f8bb52651","Type":"ContainerDied","Data":"5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f"} Mar 20 18:39:11 crc kubenswrapper[4690]: I0320 18:39:11.171800 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:11 crc kubenswrapper[4690]: I0320 18:39:11.172079 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:11 crc kubenswrapper[4690]: I0320 18:39:11.239905 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:11 crc kubenswrapper[4690]: I0320 18:39:11.669342 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8tc" event={"ID":"946b16f4-d454-4b40-b2c5-453f8bb52651","Type":"ContainerStarted","Data":"1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b"} Mar 20 18:39:11 crc kubenswrapper[4690]: I0320 18:39:11.693482 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nl8tc" podStartSLOduration=3.153886643 podStartE2EDuration="6.693463492s" podCreationTimestamp="2026-03-20 18:39:05 +0000 UTC" firstStartedPulling="2026-03-20 18:39:07.624749058 +0000 UTC m=+4022.490574746" lastFinishedPulling="2026-03-20 18:39:11.164325917 +0000 UTC m=+4026.030151595" observedRunningTime="2026-03-20 18:39:11.689662654 +0000 UTC m=+4026.555488322" watchObservedRunningTime="2026-03-20 18:39:11.693463492 +0000 UTC m=+4026.559289170" Mar 20 18:39:11 crc kubenswrapper[4690]: I0320 18:39:11.715992 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:13 crc kubenswrapper[4690]: I0320 18:39:13.040286 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6sx89"] Mar 20 18:39:13 crc kubenswrapper[4690]: I0320 18:39:13.686822 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6sx89" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerName="registry-server" containerID="cri-o://2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213" gracePeriod=2 Mar 20 18:39:14 crc kubenswrapper[4690]: E0320 18:39:14.120619 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22fd45db_f26a_49cb_be67_edb6178d00b9.slice/crio-conmon-2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213.scope\": RecentStats: unable to find data in memory cache]" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.250854 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.418521 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-catalog-content\") pod \"22fd45db-f26a-49cb-be67-edb6178d00b9\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.418744 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-utilities\") pod \"22fd45db-f26a-49cb-be67-edb6178d00b9\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.418860 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js6sn\" (UniqueName: \"kubernetes.io/projected/22fd45db-f26a-49cb-be67-edb6178d00b9-kube-api-access-js6sn\") pod \"22fd45db-f26a-49cb-be67-edb6178d00b9\" (UID: \"22fd45db-f26a-49cb-be67-edb6178d00b9\") " Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.420288 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-utilities" (OuterVolumeSpecName: "utilities") pod "22fd45db-f26a-49cb-be67-edb6178d00b9" (UID: "22fd45db-f26a-49cb-be67-edb6178d00b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.424475 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22fd45db-f26a-49cb-be67-edb6178d00b9-kube-api-access-js6sn" (OuterVolumeSpecName: "kube-api-access-js6sn") pod "22fd45db-f26a-49cb-be67-edb6178d00b9" (UID: "22fd45db-f26a-49cb-be67-edb6178d00b9"). InnerVolumeSpecName "kube-api-access-js6sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.522122 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js6sn\" (UniqueName: \"kubernetes.io/projected/22fd45db-f26a-49cb-be67-edb6178d00b9-kube-api-access-js6sn\") on node \"crc\" DevicePath \"\"" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.522190 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.687447 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22fd45db-f26a-49cb-be67-edb6178d00b9" (UID: "22fd45db-f26a-49cb-be67-edb6178d00b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.700526 4690 generic.go:334] "Generic (PLEG): container finished" podID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerID="2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213" exitCode=0 Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.700582 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6sx89" event={"ID":"22fd45db-f26a-49cb-be67-edb6178d00b9","Type":"ContainerDied","Data":"2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213"} Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.700604 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6sx89" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.700641 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6sx89" event={"ID":"22fd45db-f26a-49cb-be67-edb6178d00b9","Type":"ContainerDied","Data":"c152a739562357fe9265b6309216f06bf69eb5370ff3cd311e2d44a33340d453"} Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.700671 4690 scope.go:117] "RemoveContainer" containerID="2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.725948 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fd45db-f26a-49cb-be67-edb6178d00b9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.735716 4690 scope.go:117] "RemoveContainer" containerID="62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.742974 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6sx89"] Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.754079 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6sx89"] Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.773149 4690 scope.go:117] "RemoveContainer" containerID="647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.820169 4690 scope.go:117] "RemoveContainer" containerID="2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213" Mar 20 18:39:14 crc kubenswrapper[4690]: E0320 18:39:14.820656 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213\": container with ID starting with 2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213 not found: ID does not exist" containerID="2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.820697 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213"} err="failed to get container status \"2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213\": rpc error: code = NotFound desc = could not find container \"2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213\": container with ID starting with 2752ab6b6a5953e2bfb14d5042f5b4c819ee3d2c776e7207d60b0ba0ae7cd213 not found: ID does not exist" Mar 20 
18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.820739 4690 scope.go:117] "RemoveContainer" containerID="62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc" Mar 20 18:39:14 crc kubenswrapper[4690]: E0320 18:39:14.820965 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc\": container with ID starting with 62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc not found: ID does not exist" containerID="62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.820992 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc"} err="failed to get container status \"62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc\": rpc error: code = NotFound desc = could not find container \"62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc\": container with ID starting with 62153c762533bae716109d12de596e9054a66f793d2d084ff1be1b7481298efc not found: ID does not exist" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.821010 4690 scope.go:117] "RemoveContainer" containerID="647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760" Mar 20 18:39:14 crc kubenswrapper[4690]: E0320 18:39:14.821290 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760\": container with ID starting with 647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760 not found: ID does not exist" containerID="647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760" Mar 20 18:39:14 crc kubenswrapper[4690]: I0320 18:39:14.821317 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760"} err="failed to get container status \"647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760\": rpc error: code = NotFound desc = could not find container \"647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760\": container with ID starting with 647274f57afa8a5c85f2fabff675e03873884f639de6fcda53806c527e9c5760 not found: ID does not exist" Mar 20 18:39:15 crc kubenswrapper[4690]: I0320 18:39:15.898751 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" path="/var/lib/kubelet/pods/22fd45db-f26a-49cb-be67-edb6178d00b9/volumes" Mar 20 18:39:15 crc kubenswrapper[4690]: I0320 18:39:15.966401 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:15 crc kubenswrapper[4690]: I0320 18:39:15.966543 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:16 crc kubenswrapper[4690]: I0320 18:39:16.037134 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:16 crc kubenswrapper[4690]: I0320 18:39:16.786617 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:17 crc kubenswrapper[4690]: I0320 18:39:17.630311 4690 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nl8tc"] Mar 20 18:39:18 crc kubenswrapper[4690]: I0320 18:39:18.742870 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nl8tc" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerName="registry-server" containerID="cri-o://1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b" gracePeriod=2 Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.660817 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.754327 4690 generic.go:334] "Generic (PLEG): container finished" podID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerID="1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b" exitCode=0 Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.754374 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8tc" event={"ID":"946b16f4-d454-4b40-b2c5-453f8bb52651","Type":"ContainerDied","Data":"1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b"} Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.754390 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl8tc" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.754403 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8tc" event={"ID":"946b16f4-d454-4b40-b2c5-453f8bb52651","Type":"ContainerDied","Data":"5a2db0ba9a1b1dc85ae64b3a56a963dd38d17aafd132fa78db15d91dc4928b35"} Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.754432 4690 scope.go:117] "RemoveContainer" containerID="1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.783814 4690 scope.go:117] "RemoveContainer" containerID="5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.808837 4690 scope.go:117] "RemoveContainer" containerID="9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.837043 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kgsw\" (UniqueName: \"kubernetes.io/projected/946b16f4-d454-4b40-b2c5-453f8bb52651-kube-api-access-5kgsw\") pod \"946b16f4-d454-4b40-b2c5-453f8bb52651\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.837194 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-catalog-content\") pod \"946b16f4-d454-4b40-b2c5-453f8bb52651\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.837578 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-utilities\") pod \"946b16f4-d454-4b40-b2c5-453f8bb52651\" (UID: \"946b16f4-d454-4b40-b2c5-453f8bb52651\") " Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.839026 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-utilities" (OuterVolumeSpecName: "utilities") pod "946b16f4-d454-4b40-b2c5-453f8bb52651" (UID: "946b16f4-d454-4b40-b2c5-453f8bb52651"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.841899 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/946b16f4-d454-4b40-b2c5-453f8bb52651-kube-api-access-5kgsw" (OuterVolumeSpecName: "kube-api-access-5kgsw") pod "946b16f4-d454-4b40-b2c5-453f8bb52651" (UID: "946b16f4-d454-4b40-b2c5-453f8bb52651"). InnerVolumeSpecName "kube-api-access-5kgsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.851366 4690 scope.go:117] "RemoveContainer" containerID="1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b" Mar 20 18:39:19 crc kubenswrapper[4690]: E0320 18:39:19.851742 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b\": container with ID starting with 1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b not found: ID does not exist" containerID="1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.851802 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b"} err="failed to get container status \"1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b\": rpc error: code = NotFound desc = could not find container \"1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b\": container with ID starting with 1f36b877bf4d4a71a56a16a9207f7519ef4f877621f3a7a31724cf954c58004b not found: ID does not exist" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.851826 4690 scope.go:117] "RemoveContainer" containerID="5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f" Mar 20 18:39:19 crc kubenswrapper[4690]: E0320 18:39:19.852323 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f\": container with ID starting with 5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f not found: ID does not exist" containerID="5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.852381 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f"} err="failed to get container status \"5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f\": rpc error: code = NotFound desc = could not find container \"5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f\": container with ID starting with 5dc7b8540f59d88568146c26772c2b104d8b045d92d875489bb4b8d43a36b67f not found: ID does not exist" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.852412 4690 scope.go:117] "RemoveContainer" containerID="9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37" Mar 20 18:39:19 crc kubenswrapper[4690]: E0320 18:39:19.852802 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37\": container with ID starting with 9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37 not found: ID does not exist" containerID="9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.852847 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37"} err="failed to get container status \"9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37\": rpc error: code = NotFound desc = could not find container \"9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37\": container with ID starting with 9e873e769f411b69a5d687adb1276ed4e3fb231f50ec52d38078e5474716ac37 not found: ID does not exist" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.898774 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "946b16f4-d454-4b40-b2c5-453f8bb52651" (UID: "946b16f4-d454-4b40-b2c5-453f8bb52651"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.939978 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.940019 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kgsw\" (UniqueName: \"kubernetes.io/projected/946b16f4-d454-4b40-b2c5-453f8bb52651-kube-api-access-5kgsw\") on node \"crc\" DevicePath \"\"" Mar 20 18:39:19 crc kubenswrapper[4690]: I0320 18:39:19.940037 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946b16f4-d454-4b40-b2c5-453f8bb52651-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:39:20 crc kubenswrapper[4690]: I0320 18:39:20.094510 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nl8tc"] Mar 20 18:39:20 crc kubenswrapper[4690]: I0320 18:39:20.105435 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nl8tc"] Mar 20 18:39:21 crc kubenswrapper[4690]: I0320 18:39:21.903329 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" path="/var/lib/kubelet/pods/946b16f4-d454-4b40-b2c5-453f8bb52651/volumes" Mar 20 18:39:25 crc kubenswrapper[4690]: I0320 18:39:25.393959 4690 scope.go:117] "RemoveContainer" containerID="5091fd8657afe0c47155cc97e7fce21b48d72f658e95ae63fd2ae9c29f36d962" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.149811 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567200-2z4fn"] Mar 20 18:40:00 crc kubenswrapper[4690]: E0320 18:40:00.150898 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerName="registry-server" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.150915 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerName="registry-server" Mar 20 18:40:00 crc kubenswrapper[4690]: E0320 
18:40:00.150940 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerName="extract-content" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.150949 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerName="extract-content" Mar 20 18:40:00 crc kubenswrapper[4690]: E0320 18:40:00.150960 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerName="extract-utilities" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.150969 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerName="extract-utilities" Mar 20 18:40:00 crc kubenswrapper[4690]: E0320 18:40:00.150984 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerName="registry-server" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.150992 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerName="registry-server" Mar 20 18:40:00 crc kubenswrapper[4690]: E0320 18:40:00.151020 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerName="extract-utilities" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.151027 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerName="extract-utilities" Mar 20 18:40:00 crc kubenswrapper[4690]: E0320 18:40:00.151042 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerName="extract-content" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.151050 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerName="extract-content" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.151368 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="946b16f4-d454-4b40-b2c5-453f8bb52651" containerName="registry-server" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.151399 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="22fd45db-f26a-49cb-be67-edb6178d00b9" containerName="registry-server" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.152053 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-2z4fn" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.154874 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.156092 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.161711 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.173312 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-2z4fn"] Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.287935 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb98v\" (UniqueName: \"kubernetes.io/projected/be39da38-bef8-41df-9277-b515edecd46c-kube-api-access-hb98v\") pod \"auto-csr-approver-29567200-2z4fn\" (UID: \"be39da38-bef8-41df-9277-b515edecd46c\") " pod="openshift-infra/auto-csr-approver-29567200-2z4fn" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.389421 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb98v\" (UniqueName: \"kubernetes.io/projected/be39da38-bef8-41df-9277-b515edecd46c-kube-api-access-hb98v\") pod \"auto-csr-approver-29567200-2z4fn\" (UID: \"be39da38-bef8-41df-9277-b515edecd46c\") " pod="openshift-infra/auto-csr-approver-29567200-2z4fn" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.419164 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb98v\" (UniqueName: \"kubernetes.io/projected/be39da38-bef8-41df-9277-b515edecd46c-kube-api-access-hb98v\") pod \"auto-csr-approver-29567200-2z4fn\" (UID: \"be39da38-bef8-41df-9277-b515edecd46c\") " pod="openshift-infra/auto-csr-approver-29567200-2z4fn" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.473705 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-2z4fn" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.513165 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7rkqw"] Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.515627 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.537175 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rkqw"] Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.697151 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-catalog-content\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.697508 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c477c\" (UniqueName: \"kubernetes.io/projected/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-kube-api-access-c477c\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.697766 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-utilities\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.800190 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-catalog-content\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.800248 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c477c\" (UniqueName: \"kubernetes.io/projected/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-kube-api-access-c477c\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.800362 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-utilities\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.800733 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-catalog-content\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.800967 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-utilities\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.827240 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c477c\" (UniqueName: \"kubernetes.io/projected/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-kube-api-access-c477c\") pod \"redhat-operators-7rkqw\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:00 crc kubenswrapper[4690]: I0320 18:40:00.923738 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:01 crc kubenswrapper[4690]: I0320 18:40:01.055378 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-2z4fn"] Mar 20 18:40:01 crc kubenswrapper[4690]: I0320 18:40:01.161382 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-2z4fn" event={"ID":"be39da38-bef8-41df-9277-b515edecd46c","Type":"ContainerStarted","Data":"eed52b12f55a79ff3037c13fbdc410f8e3a06cf1532ac66fd1be2a352f381270"} Mar 20 18:40:01 crc kubenswrapper[4690]: W0320 18:40:01.391059 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660c2568_e93c_4d1d_9e9f_d008b8d96fc0.slice/crio-099006bbfd6266987122b8191c79fb323bcd94f122e5d2a846d33f61f841c804 WatchSource:0}: Error finding container 099006bbfd6266987122b8191c79fb323bcd94f122e5d2a846d33f61f841c804: Status 404 returned error can't find the container with id 099006bbfd6266987122b8191c79fb323bcd94f122e5d2a846d33f61f841c804 Mar 20 18:40:01 crc kubenswrapper[4690]: I0320 18:40:01.399246 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rkqw"] Mar 20 18:40:02 crc kubenswrapper[4690]: I0320 18:40:02.169976 4690 generic.go:334] "Generic (PLEG): container finished" podID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerID="f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300" exitCode=0 Mar 20 18:40:02 crc kubenswrapper[4690]: I0320 18:40:02.170034 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rkqw" event={"ID":"660c2568-e93c-4d1d-9e9f-d008b8d96fc0","Type":"ContainerDied","Data":"f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300"} Mar 20 18:40:02 crc kubenswrapper[4690]: I0320 18:40:02.170088 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rkqw" event={"ID":"660c2568-e93c-4d1d-9e9f-d008b8d96fc0","Type":"ContainerStarted","Data":"099006bbfd6266987122b8191c79fb323bcd94f122e5d2a846d33f61f841c804"} Mar 20 18:40:03 crc kubenswrapper[4690]: I0320 18:40:03.181105 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rkqw" event={"ID":"660c2568-e93c-4d1d-9e9f-d008b8d96fc0","Type":"ContainerStarted","Data":"3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5"} Mar 20 18:40:03 crc kubenswrapper[4690]: I0320 18:40:03.184651 4690 generic.go:334] "Generic (PLEG): container finished" podID="be39da38-bef8-41df-9277-b515edecd46c" containerID="dc3c16228c72e0cb3ecf520f98d305ffd477107c631cb40b3f014effbd621188" exitCode=0 Mar 20 18:40:03 crc kubenswrapper[4690]: I0320 18:40:03.184728 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-2z4fn" event={"ID":"be39da38-bef8-41df-9277-b515edecd46c","Type":"ContainerDied","Data":"dc3c16228c72e0cb3ecf520f98d305ffd477107c631cb40b3f014effbd621188"} Mar 20 18:40:04 crc kubenswrapper[4690]: I0320 18:40:04.196573 4690 generic.go:334] "Generic (PLEG): 
container finished" podID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerID="3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5" exitCode=0 Mar 20 18:40:04 crc kubenswrapper[4690]: I0320 18:40:04.196675 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rkqw" event={"ID":"660c2568-e93c-4d1d-9e9f-d008b8d96fc0","Type":"ContainerDied","Data":"3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5"} Mar 20 18:40:04 crc kubenswrapper[4690]: I0320 18:40:04.791118 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-2z4fn" Mar 20 18:40:04 crc kubenswrapper[4690]: I0320 18:40:04.882582 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb98v\" (UniqueName: \"kubernetes.io/projected/be39da38-bef8-41df-9277-b515edecd46c-kube-api-access-hb98v\") pod \"be39da38-bef8-41df-9277-b515edecd46c\" (UID: \"be39da38-bef8-41df-9277-b515edecd46c\") " Mar 20 18:40:04 crc kubenswrapper[4690]: I0320 18:40:04.891456 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be39da38-bef8-41df-9277-b515edecd46c-kube-api-access-hb98v" (OuterVolumeSpecName: "kube-api-access-hb98v") pod "be39da38-bef8-41df-9277-b515edecd46c" (UID: "be39da38-bef8-41df-9277-b515edecd46c"). InnerVolumeSpecName "kube-api-access-hb98v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:40:04 crc kubenswrapper[4690]: I0320 18:40:04.986884 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb98v\" (UniqueName: \"kubernetes.io/projected/be39da38-bef8-41df-9277-b515edecd46c-kube-api-access-hb98v\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:05 crc kubenswrapper[4690]: I0320 18:40:05.206015 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rkqw" event={"ID":"660c2568-e93c-4d1d-9e9f-d008b8d96fc0","Type":"ContainerStarted","Data":"85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf"} Mar 20 18:40:05 crc kubenswrapper[4690]: I0320 18:40:05.208728 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-2z4fn" event={"ID":"be39da38-bef8-41df-9277-b515edecd46c","Type":"ContainerDied","Data":"eed52b12f55a79ff3037c13fbdc410f8e3a06cf1532ac66fd1be2a352f381270"} Mar 20 18:40:05 crc kubenswrapper[4690]: I0320 18:40:05.208755 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-2z4fn" Mar 20 18:40:05 crc kubenswrapper[4690]: I0320 18:40:05.208768 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eed52b12f55a79ff3037c13fbdc410f8e3a06cf1532ac66fd1be2a352f381270" Mar 20 18:40:05 crc kubenswrapper[4690]: I0320 18:40:05.228568 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7rkqw" podStartSLOduration=2.7585619660000003 podStartE2EDuration="5.228551854s" podCreationTimestamp="2026-03-20 18:40:00 +0000 UTC" firstStartedPulling="2026-03-20 18:40:02.171766424 +0000 UTC m=+4077.037592102" lastFinishedPulling="2026-03-20 18:40:04.641756322 +0000 UTC m=+4079.507581990" observedRunningTime="2026-03-20 18:40:05.223195712 +0000 UTC m=+4080.089021390" watchObservedRunningTime="2026-03-20 18:40:05.228551854 +0000 UTC m=+4080.094377532" Mar 20 18:40:05 crc kubenswrapper[4690]: E0320 18:40:05.357326 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe39da38_bef8_41df_9277_b515edecd46c.slice/crio-eed52b12f55a79ff3037c13fbdc410f8e3a06cf1532ac66fd1be2a352f381270\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe39da38_bef8_41df_9277_b515edecd46c.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:40:05 crc kubenswrapper[4690]: I0320 18:40:05.868514 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-pqwm6"] Mar 20 18:40:05 crc kubenswrapper[4690]: I0320 18:40:05.881718 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-pqwm6"] Mar 20 18:40:05 crc kubenswrapper[4690]: I0320 18:40:05.900641 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3f0313-6464-43d3-96df-ae1a675322b1" path="/var/lib/kubelet/pods/3e3f0313-6464-43d3-96df-ae1a675322b1/volumes" Mar 20 18:40:10 crc kubenswrapper[4690]: I0320 18:40:10.923895 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:10 crc kubenswrapper[4690]: I0320 18:40:10.924583 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:11 crc kubenswrapper[4690]: I0320 18:40:11.989603 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7rkqw" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="registry-server" probeResult="failure" output=< Mar 20 18:40:11 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s Mar 20 18:40:11 crc kubenswrapper[4690]: > Mar 20 18:40:20 crc kubenswrapper[4690]: I0320 18:40:20.995001 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:21 crc kubenswrapper[4690]: I0320 18:40:21.077619 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:21 crc kubenswrapper[4690]: I0320 18:40:21.254692 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rkqw"] Mar 20 18:40:22 crc kubenswrapper[4690]: I0320 18:40:22.394249 4690 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-7rkqw" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="registry-server" containerID="cri-o://85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf" gracePeriod=2 Mar 20 18:40:22 crc kubenswrapper[4690]: I0320 18:40:22.969469 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:22 crc kubenswrapper[4690]: I0320 18:40:22.985664 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-catalog-content\") pod \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " Mar 20 18:40:22 crc kubenswrapper[4690]: I0320 18:40:22.985779 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c477c\" (UniqueName: \"kubernetes.io/projected/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-kube-api-access-c477c\") pod \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " Mar 20 18:40:22 crc kubenswrapper[4690]: I0320 18:40:22.985858 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-utilities\") pod \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\" (UID: \"660c2568-e93c-4d1d-9e9f-d008b8d96fc0\") " Mar 20 18:40:22 crc kubenswrapper[4690]: I0320 18:40:22.990768 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-utilities" (OuterVolumeSpecName: "utilities") pod "660c2568-e93c-4d1d-9e9f-d008b8d96fc0" (UID: "660c2568-e93c-4d1d-9e9f-d008b8d96fc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.006645 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-kube-api-access-c477c" (OuterVolumeSpecName: "kube-api-access-c477c") pod "660c2568-e93c-4d1d-9e9f-d008b8d96fc0" (UID: "660c2568-e93c-4d1d-9e9f-d008b8d96fc0"). InnerVolumeSpecName "kube-api-access-c477c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.090090 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.090152 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c477c\" (UniqueName: \"kubernetes.io/projected/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-kube-api-access-c477c\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.118285 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "660c2568-e93c-4d1d-9e9f-d008b8d96fc0" (UID: "660c2568-e93c-4d1d-9e9f-d008b8d96fc0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.193511 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/660c2568-e93c-4d1d-9e9f-d008b8d96fc0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.408923 4690 generic.go:334] "Generic (PLEG): container finished" podID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerID="85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf" exitCode=0 Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.408998 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rkqw" event={"ID":"660c2568-e93c-4d1d-9e9f-d008b8d96fc0","Type":"ContainerDied","Data":"85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf"} Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.409052 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rkqw" event={"ID":"660c2568-e93c-4d1d-9e9f-d008b8d96fc0","Type":"ContainerDied","Data":"099006bbfd6266987122b8191c79fb323bcd94f122e5d2a846d33f61f841c804"} Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.409092 4690 scope.go:117] "RemoveContainer" containerID="85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.409111 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rkqw" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.443383 4690 scope.go:117] "RemoveContainer" containerID="3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.477921 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rkqw"] Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.478151 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7rkqw"] Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.485791 4690 scope.go:117] "RemoveContainer" containerID="f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.538005 4690 scope.go:117] "RemoveContainer" containerID="85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf" Mar 20 18:40:23 crc kubenswrapper[4690]: E0320 18:40:23.539683 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf\": container with ID starting with 85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf not found: ID does not exist" containerID="85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.539741 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf"} err="failed to get container status \"85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf\": rpc error: code = NotFound desc = could not find container \"85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf\": container with ID starting with 85c1e75a761022c51fc82a71b1f8fc1e7c080f27c76eeedf1c2a8d8db46abcaf not found: ID does not exist" Mar 20 18:40:23 crc 
kubenswrapper[4690]: I0320 18:40:23.539777 4690 scope.go:117] "RemoveContainer" containerID="3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5" Mar 20 18:40:23 crc kubenswrapper[4690]: E0320 18:40:23.540818 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5\": container with ID starting with 3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5 not found: ID does not exist" containerID="3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.540869 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5"} err="failed to get container status \"3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5\": rpc error: code = NotFound desc = could not find container \"3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5\": container with ID starting with 3d04b5ba77546a06de30a9d2e5fcb3e0a6466a2cebccdd5c1f7bf3106d7f63d5 not found: ID does not exist" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.540895 4690 scope.go:117] "RemoveContainer" containerID="f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300" Mar 20 18:40:23 crc kubenswrapper[4690]: E0320 18:40:23.541180 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300\": container with ID starting with f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300 not found: ID does not exist" containerID="f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.541228 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300"} err="failed to get container status \"f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300\": rpc error: code = NotFound desc = could not find container \"f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300\": container with ID starting with f778cae21f7a7b2a654060c211c73c6e2a3a52778d5a51e8b61b54f3ecf6b300 not found: ID does not exist" Mar 20 18:40:23 crc kubenswrapper[4690]: I0320 18:40:23.901192 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" path="/var/lib/kubelet/pods/660c2568-e93c-4d1d-9e9f-d008b8d96fc0/volumes" Mar 20 18:40:25 crc kubenswrapper[4690]: I0320 18:40:25.522889 4690 scope.go:117] "RemoveContainer" containerID="d8ac455d756c20d2924c9548f608312f8a1e91a391cf9c76e9bfb32b96ff2d44" Mar 20 18:40:54 crc kubenswrapper[4690]: I0320 18:40:54.274305 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:40:54 crc kubenswrapper[4690]: I0320 18:40:54.274888 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.526257 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fhzm4/must-gather-jbnnt"] Mar 20 18:40:56 crc kubenswrapper[4690]: E0320 18:40:56.527305 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="registry-server" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.527321 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="registry-server" Mar 20 18:40:56 crc kubenswrapper[4690]: E0320 18:40:56.527338 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be39da38-bef8-41df-9277-b515edecd46c" containerName="oc" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.527345 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="be39da38-bef8-41df-9277-b515edecd46c" containerName="oc" Mar 20 18:40:56 crc kubenswrapper[4690]: E0320 18:40:56.527380 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="extract-utilities" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.527388 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="extract-utilities" Mar 20 18:40:56 crc kubenswrapper[4690]: E0320 18:40:56.527399 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="extract-content" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.527405 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="extract-content" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.527679 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="660c2568-e93c-4d1d-9e9f-d008b8d96fc0" containerName="registry-server" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.527697 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="be39da38-bef8-41df-9277-b515edecd46c" containerName="oc" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.528849 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.530557 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fhzm4"/"openshift-service-ca.crt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.530974 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fhzm4"/"default-dockercfg-6ghhx" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.532171 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fhzm4"/"kube-root-ca.crt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.537037 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fhzm4/must-gather-jbnnt"] Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.715346 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljq8t\" (UniqueName: \"kubernetes.io/projected/27395613-e0e9-49a0-a752-008c71dd5c23-kube-api-access-ljq8t\") pod \"must-gather-jbnnt\" (UID: \"27395613-e0e9-49a0-a752-008c71dd5c23\") " pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.715406 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27395613-e0e9-49a0-a752-008c71dd5c23-must-gather-output\") pod \"must-gather-jbnnt\" (UID: \"27395613-e0e9-49a0-a752-008c71dd5c23\") " pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.817383 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljq8t\" (UniqueName: \"kubernetes.io/projected/27395613-e0e9-49a0-a752-008c71dd5c23-kube-api-access-ljq8t\") pod \"must-gather-jbnnt\" (UID: \"27395613-e0e9-49a0-a752-008c71dd5c23\") " pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.817438 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27395613-e0e9-49a0-a752-008c71dd5c23-must-gather-output\") pod \"must-gather-jbnnt\" (UID: \"27395613-e0e9-49a0-a752-008c71dd5c23\") " pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.817917 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27395613-e0e9-49a0-a752-008c71dd5c23-must-gather-output\") pod \"must-gather-jbnnt\" (UID: \"27395613-e0e9-49a0-a752-008c71dd5c23\") " pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.835595 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljq8t\" (UniqueName: \"kubernetes.io/projected/27395613-e0e9-49a0-a752-008c71dd5c23-kube-api-access-ljq8t\") pod \"must-gather-jbnnt\" (UID: \"27395613-e0e9-49a0-a752-008c71dd5c23\") " pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:40:56 crc kubenswrapper[4690]: I0320 18:40:56.846684 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:40:57 crc kubenswrapper[4690]: I0320 18:40:57.285436 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fhzm4/must-gather-jbnnt"] Mar 20 18:40:57 crc kubenswrapper[4690]: I0320 18:40:57.760300 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" event={"ID":"27395613-e0e9-49a0-a752-008c71dd5c23","Type":"ContainerStarted","Data":"c2ce96c4ac33ee45e2281d7cce9a08dd409a12388bd9bb0d846d20248ab79024"} Mar 20 18:40:57 crc kubenswrapper[4690]: I0320 18:40:57.760616 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" event={"ID":"27395613-e0e9-49a0-a752-008c71dd5c23","Type":"ContainerStarted","Data":"adcff40499340913ed7b0fc5c6686b4220658e4f2d3f458f5b037846c748f655"} Mar 20 18:40:58 crc kubenswrapper[4690]: I0320 18:40:58.773129 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" event={"ID":"27395613-e0e9-49a0-a752-008c71dd5c23","Type":"ContainerStarted","Data":"3f525513029ea6b94d57effc8292fcc7fa821331b3259ea6aec14f8eae863f85"} Mar 20 18:40:58 crc kubenswrapper[4690]: I0320 18:40:58.791110 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" podStartSLOduration=2.79109359 podStartE2EDuration="2.79109359s" podCreationTimestamp="2026-03-20 18:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:40:58.789657219 +0000 UTC m=+4133.655482937" watchObservedRunningTime="2026-03-20 18:40:58.79109359 +0000 UTC m=+4133.656919268" Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.355399 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-vzkm6"] Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.357666 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.551268 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad46d478-f7f9-4242-8622-952ccf4c4c24-host\") pod \"crc-debug-vzkm6\" (UID: \"ad46d478-f7f9-4242-8622-952ccf4c4c24\") " pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.551373 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkp4j\" (UniqueName: \"kubernetes.io/projected/ad46d478-f7f9-4242-8622-952ccf4c4c24-kube-api-access-kkp4j\") pod \"crc-debug-vzkm6\" (UID: \"ad46d478-f7f9-4242-8622-952ccf4c4c24\") " pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.652934 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad46d478-f7f9-4242-8622-952ccf4c4c24-host\") pod \"crc-debug-vzkm6\" (UID: \"ad46d478-f7f9-4242-8622-952ccf4c4c24\") " pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.653022 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad46d478-f7f9-4242-8622-952ccf4c4c24-host\") pod \"crc-debug-vzkm6\" (UID: \"ad46d478-f7f9-4242-8622-952ccf4c4c24\") " pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.654286 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkp4j\" (UniqueName: \"kubernetes.io/projected/ad46d478-f7f9-4242-8622-952ccf4c4c24-kube-api-access-kkp4j\") pod \"crc-debug-vzkm6\" (UID: \"ad46d478-f7f9-4242-8622-952ccf4c4c24\") " pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.683001 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkp4j\" (UniqueName: \"kubernetes.io/projected/ad46d478-f7f9-4242-8622-952ccf4c4c24-kube-api-access-kkp4j\") pod \"crc-debug-vzkm6\" (UID: \"ad46d478-f7f9-4242-8622-952ccf4c4c24\") " pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:01 crc kubenswrapper[4690]: I0320 18:41:01.974495 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:02 crc kubenswrapper[4690]: W0320 18:41:02.525418 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad46d478_f7f9_4242_8622_952ccf4c4c24.slice/crio-cb45b851ba480971e6055f7eb2b716745f651f349a1346b04057b4f1a8f8feb5 WatchSource:0}: Error finding container cb45b851ba480971e6055f7eb2b716745f651f349a1346b04057b4f1a8f8feb5: Status 404 returned error can't find the container with id cb45b851ba480971e6055f7eb2b716745f651f349a1346b04057b4f1a8f8feb5 Mar 20 18:41:02 crc kubenswrapper[4690]: I0320 18:41:02.823829 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" event={"ID":"ad46d478-f7f9-4242-8622-952ccf4c4c24","Type":"ContainerStarted","Data":"cb45b851ba480971e6055f7eb2b716745f651f349a1346b04057b4f1a8f8feb5"} Mar 20 18:41:03 crc kubenswrapper[4690]: I0320 18:41:03.833465 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" event={"ID":"ad46d478-f7f9-4242-8622-952ccf4c4c24","Type":"ContainerStarted","Data":"bdd6a137479f10fe4952af146545b92452e3a79d559335346a98df5161a2900d"} Mar 20 18:41:03 crc kubenswrapper[4690]: I0320 18:41:03.854374 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" podStartSLOduration=2.8543592650000003 podStartE2EDuration="2.854359265s" podCreationTimestamp="2026-03-20 18:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:41:03.849729664 +0000 UTC m=+4138.715555342" watchObservedRunningTime="2026-03-20 18:41:03.854359265 +0000 UTC m=+4138.720184943" Mar 20 18:41:24 crc kubenswrapper[4690]: I0320 18:41:24.273917 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:41:24 crc kubenswrapper[4690]: I0320 18:41:24.274420 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:41:36 crc kubenswrapper[4690]: I0320 18:41:36.129236 4690 generic.go:334] "Generic (PLEG): container finished" podID="ad46d478-f7f9-4242-8622-952ccf4c4c24" containerID="bdd6a137479f10fe4952af146545b92452e3a79d559335346a98df5161a2900d" exitCode=0 Mar 20 18:41:36 crc kubenswrapper[4690]: I0320 18:41:36.129338 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" event={"ID":"ad46d478-f7f9-4242-8622-952ccf4c4c24","Type":"ContainerDied","Data":"bdd6a137479f10fe4952af146545b92452e3a79d559335346a98df5161a2900d"} Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.617788 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.650932 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-vzkm6"] Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.658654 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-vzkm6"] Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.704316 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkp4j\" (UniqueName: \"kubernetes.io/projected/ad46d478-f7f9-4242-8622-952ccf4c4c24-kube-api-access-kkp4j\") pod \"ad46d478-f7f9-4242-8622-952ccf4c4c24\" (UID: \"ad46d478-f7f9-4242-8622-952ccf4c4c24\") " Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.704504 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad46d478-f7f9-4242-8622-952ccf4c4c24-host\") pod \"ad46d478-f7f9-4242-8622-952ccf4c4c24\" (UID: \"ad46d478-f7f9-4242-8622-952ccf4c4c24\") " Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.704664 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad46d478-f7f9-4242-8622-952ccf4c4c24-host" (OuterVolumeSpecName: "host") pod "ad46d478-f7f9-4242-8622-952ccf4c4c24" (UID: "ad46d478-f7f9-4242-8622-952ccf4c4c24"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.704931 4690 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad46d478-f7f9-4242-8622-952ccf4c4c24-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.722241 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad46d478-f7f9-4242-8622-952ccf4c4c24-kube-api-access-kkp4j" (OuterVolumeSpecName: "kube-api-access-kkp4j") pod "ad46d478-f7f9-4242-8622-952ccf4c4c24" (UID: "ad46d478-f7f9-4242-8622-952ccf4c4c24"). InnerVolumeSpecName "kube-api-access-kkp4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.806399 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkp4j\" (UniqueName: \"kubernetes.io/projected/ad46d478-f7f9-4242-8622-952ccf4c4c24-kube-api-access-kkp4j\") on node \"crc\" DevicePath \"\"" Mar 20 18:41:37 crc kubenswrapper[4690]: I0320 18:41:37.902301 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad46d478-f7f9-4242-8622-952ccf4c4c24" path="/var/lib/kubelet/pods/ad46d478-f7f9-4242-8622-952ccf4c4c24/volumes" Mar 20 18:41:38 crc kubenswrapper[4690]: I0320 18:41:38.148449 4690 scope.go:117] "RemoveContainer" containerID="bdd6a137479f10fe4952af146545b92452e3a79d559335346a98df5161a2900d" Mar 20 18:41:38 crc kubenswrapper[4690]: I0320 18:41:38.148578 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-vzkm6" Mar 20 18:41:38 crc kubenswrapper[4690]: I0320 18:41:38.844718 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-d7rj9"] Mar 20 18:41:38 crc kubenswrapper[4690]: E0320 18:41:38.845483 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad46d478-f7f9-4242-8622-952ccf4c4c24" containerName="container-00" Mar 20 18:41:38 crc kubenswrapper[4690]: I0320 18:41:38.845498 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad46d478-f7f9-4242-8622-952ccf4c4c24" containerName="container-00" Mar 20 18:41:38 crc kubenswrapper[4690]: I0320 18:41:38.845734 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad46d478-f7f9-4242-8622-952ccf4c4c24" containerName="container-00" Mar 20 18:41:38 crc kubenswrapper[4690]: I0320 18:41:38.846479 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:38 crc kubenswrapper[4690]: I0320 18:41:38.925772 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-host\") pod \"crc-debug-d7rj9\" (UID: \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\") " pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:38 crc kubenswrapper[4690]: I0320 18:41:38.925895 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5d9b\" (UniqueName: \"kubernetes.io/projected/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-kube-api-access-t5d9b\") pod \"crc-debug-d7rj9\" (UID: \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\") " pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:39 crc kubenswrapper[4690]: I0320 18:41:39.032515 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5d9b\" (UniqueName: \"kubernetes.io/projected/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-kube-api-access-t5d9b\") pod \"crc-debug-d7rj9\" (UID: \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\") " pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:39 crc kubenswrapper[4690]: I0320 18:41:39.032800 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-host\") pod \"crc-debug-d7rj9\" (UID: \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\") " pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:39 crc kubenswrapper[4690]: I0320 18:41:39.034288 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-host\") pod \"crc-debug-d7rj9\" (UID: \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\") " pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:39 crc kubenswrapper[4690]: I0320 18:41:39.310951 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5d9b\" (UniqueName: \"kubernetes.io/projected/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-kube-api-access-t5d9b\") pod \"crc-debug-d7rj9\" (UID: \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\") " pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:39 crc kubenswrapper[4690]: I0320 18:41:39.464958 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:40 crc kubenswrapper[4690]: I0320 18:41:40.174950 4690 generic.go:334] "Generic (PLEG): container finished" podID="70b3f3c4-4160-4746-b3fc-14cf5e46d00c" containerID="68a76ede53c00274718d2b11c026e6aa46c98697a8ffa60d05fb9f35a7ee6dec" exitCode=0 Mar 20 18:41:40 crc kubenswrapper[4690]: I0320 18:41:40.175277 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" event={"ID":"70b3f3c4-4160-4746-b3fc-14cf5e46d00c","Type":"ContainerDied","Data":"68a76ede53c00274718d2b11c026e6aa46c98697a8ffa60d05fb9f35a7ee6dec"} Mar 20 18:41:40 crc kubenswrapper[4690]: I0320 18:41:40.175304 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" event={"ID":"70b3f3c4-4160-4746-b3fc-14cf5e46d00c","Type":"ContainerStarted","Data":"188224c30f65c803a689163a0dfa56313194053e16af7d08efd91d276d512d20"} Mar 20 18:41:40 crc kubenswrapper[4690]: I0320 18:41:40.561058 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-d7rj9"] Mar 20 18:41:40 crc kubenswrapper[4690]: I0320 18:41:40.570072 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-d7rj9"] Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.279331 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.378694 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-host\") pod \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\" (UID: \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\") " Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.378824 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-host" (OuterVolumeSpecName: "host") pod "70b3f3c4-4160-4746-b3fc-14cf5e46d00c" (UID: "70b3f3c4-4160-4746-b3fc-14cf5e46d00c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.378861 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5d9b\" (UniqueName: \"kubernetes.io/projected/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-kube-api-access-t5d9b\") pod \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\" (UID: \"70b3f3c4-4160-4746-b3fc-14cf5e46d00c\") " Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.379381 4690 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.384194 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-kube-api-access-t5d9b" (OuterVolumeSpecName: "kube-api-access-t5d9b") pod "70b3f3c4-4160-4746-b3fc-14cf5e46d00c" (UID: "70b3f3c4-4160-4746-b3fc-14cf5e46d00c"). InnerVolumeSpecName "kube-api-access-t5d9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.480676 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5d9b\" (UniqueName: \"kubernetes.io/projected/70b3f3c4-4160-4746-b3fc-14cf5e46d00c-kube-api-access-t5d9b\") on node \"crc\" DevicePath \"\"" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.823957 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-9x5cv"] Mar 20 18:41:41 crc kubenswrapper[4690]: E0320 18:41:41.824445 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b3f3c4-4160-4746-b3fc-14cf5e46d00c" containerName="container-00" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.824464 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b3f3c4-4160-4746-b3fc-14cf5e46d00c" containerName="container-00" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.824650 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b3f3c4-4160-4746-b3fc-14cf5e46d00c" containerName="container-00" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.825219 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.887132 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwr4b\" (UniqueName: \"kubernetes.io/projected/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-kube-api-access-mwr4b\") pod \"crc-debug-9x5cv\" (UID: \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\") " pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.887427 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-host\") pod \"crc-debug-9x5cv\" (UID: \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\") " pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.895341 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b3f3c4-4160-4746-b3fc-14cf5e46d00c" path="/var/lib/kubelet/pods/70b3f3c4-4160-4746-b3fc-14cf5e46d00c/volumes" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.989194 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwr4b\" (UniqueName: \"kubernetes.io/projected/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-kube-api-access-mwr4b\") pod \"crc-debug-9x5cv\" (UID: \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\") " pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.989660 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-host\") pod \"crc-debug-9x5cv\" (UID: \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\") " pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:41 crc kubenswrapper[4690]: I0320 18:41:41.989777 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-host\") pod \"crc-debug-9x5cv\" (UID: \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\") " pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:42 crc kubenswrapper[4690]: I0320 18:41:42.006875 4690 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mwr4b\" (UniqueName: \"kubernetes.io/projected/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-kube-api-access-mwr4b\") pod \"crc-debug-9x5cv\" (UID: \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\") " pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:42 crc kubenswrapper[4690]: I0320 18:41:42.141767 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:42 crc kubenswrapper[4690]: W0320 18:41:42.177377 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf15b475f_cfe6_4f9c_b9ea_952bac75ad96.slice/crio-4a6ddc1c8227f2f77397347f71a37ec232cd3807731ae61b1eb0ca0cdb3aac38 WatchSource:0}: Error finding container 4a6ddc1c8227f2f77397347f71a37ec232cd3807731ae61b1eb0ca0cdb3aac38: Status 404 returned error can't find the container with id 4a6ddc1c8227f2f77397347f71a37ec232cd3807731ae61b1eb0ca0cdb3aac38 Mar 20 18:41:42 crc kubenswrapper[4690]: I0320 18:41:42.195483 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" event={"ID":"f15b475f-cfe6-4f9c-b9ea-952bac75ad96","Type":"ContainerStarted","Data":"4a6ddc1c8227f2f77397347f71a37ec232cd3807731ae61b1eb0ca0cdb3aac38"} Mar 20 18:41:42 crc kubenswrapper[4690]: I0320 18:41:42.205090 4690 scope.go:117] "RemoveContainer" containerID="68a76ede53c00274718d2b11c026e6aa46c98697a8ffa60d05fb9f35a7ee6dec" Mar 20 18:41:42 crc kubenswrapper[4690]: I0320 18:41:42.205244 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-d7rj9" Mar 20 18:41:43 crc kubenswrapper[4690]: I0320 18:41:43.219962 4690 generic.go:334] "Generic (PLEG): container finished" podID="f15b475f-cfe6-4f9c-b9ea-952bac75ad96" containerID="968c5e9be3ff210d9a1f1816aa37ac044e792912257776adcc607260eb24c404" exitCode=0 Mar 20 18:41:43 crc kubenswrapper[4690]: I0320 18:41:43.220071 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" event={"ID":"f15b475f-cfe6-4f9c-b9ea-952bac75ad96","Type":"ContainerDied","Data":"968c5e9be3ff210d9a1f1816aa37ac044e792912257776adcc607260eb24c404"} Mar 20 18:41:43 crc kubenswrapper[4690]: I0320 18:41:43.278562 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-9x5cv"] Mar 20 18:41:43 crc kubenswrapper[4690]: I0320 18:41:43.291688 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fhzm4/crc-debug-9x5cv"] Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.057448 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.215289 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-host\") pod \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\" (UID: \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\") " Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.215454 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-host" (OuterVolumeSpecName: "host") pod "f15b475f-cfe6-4f9c-b9ea-952bac75ad96" (UID: "f15b475f-cfe6-4f9c-b9ea-952bac75ad96"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.215510 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwr4b\" (UniqueName: \"kubernetes.io/projected/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-kube-api-access-mwr4b\") pod \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\" (UID: \"f15b475f-cfe6-4f9c-b9ea-952bac75ad96\") " Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.216015 4690 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.220783 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-kube-api-access-mwr4b" (OuterVolumeSpecName: "kube-api-access-mwr4b") pod "f15b475f-cfe6-4f9c-b9ea-952bac75ad96" (UID: "f15b475f-cfe6-4f9c-b9ea-952bac75ad96"). InnerVolumeSpecName "kube-api-access-mwr4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.318067 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwr4b\" (UniqueName: \"kubernetes.io/projected/f15b475f-cfe6-4f9c-b9ea-952bac75ad96-kube-api-access-mwr4b\") on node \"crc\" DevicePath \"\"" Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.895494 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15b475f-cfe6-4f9c-b9ea-952bac75ad96" path="/var/lib/kubelet/pods/f15b475f-cfe6-4f9c-b9ea-952bac75ad96/volumes" Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.958853 4690 scope.go:117] "RemoveContainer" containerID="968c5e9be3ff210d9a1f1816aa37ac044e792912257776adcc607260eb24c404" Mar 20 18:41:45 crc kubenswrapper[4690]: I0320 18:41:45.958918 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/crc-debug-9x5cv" Mar 20 18:41:54 crc kubenswrapper[4690]: I0320 18:41:54.274850 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:41:54 crc kubenswrapper[4690]: I0320 18:41:54.275334 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:41:54 crc kubenswrapper[4690]: I0320 18:41:54.275389 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 18:41:54 crc kubenswrapper[4690]: I0320 18:41:54.276326 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"71d102b5805b8fe7f4d59a06973e7c063c629bb25d542cdb23626c14c624bec3"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:41:54 crc kubenswrapper[4690]: I0320 18:41:54.276400 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://71d102b5805b8fe7f4d59a06973e7c063c629bb25d542cdb23626c14c624bec3" gracePeriod=600 Mar 20 18:41:55 crc kubenswrapper[4690]: I0320 18:41:55.042754 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerID="71d102b5805b8fe7f4d59a06973e7c063c629bb25d542cdb23626c14c624bec3" exitCode=0 Mar 20 18:41:55 crc kubenswrapper[4690]: I0320 18:41:55.042964 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"71d102b5805b8fe7f4d59a06973e7c063c629bb25d542cdb23626c14c624bec3"} Mar 20 18:41:55 crc kubenswrapper[4690]: I0320 18:41:55.043295 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293"} Mar 20 18:41:55 crc kubenswrapper[4690]: I0320 18:41:55.043321 4690 scope.go:117] "RemoveContainer" containerID="bdbe59d9ce73fb94720cd5938d39dd30976660c151e267e22ef199f0f2141309" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.136666 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567202-zhspx"] Mar 20 18:42:00 crc kubenswrapper[4690]: E0320 18:42:00.138629 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15b475f-cfe6-4f9c-b9ea-952bac75ad96" containerName="container-00" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.138745 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15b475f-cfe6-4f9c-b9ea-952bac75ad96" containerName="container-00" Mar 20 18:42:00 crc 
kubenswrapper[4690]: I0320 18:42:00.139061 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15b475f-cfe6-4f9c-b9ea-952bac75ad96" containerName="container-00" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.139939 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-zhspx" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.143109 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.143392 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.146456 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.158687 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567202-zhspx"] Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.203715 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49t7v\" (UniqueName: \"kubernetes.io/projected/a16fca88-e24b-40d2-ae8c-b4fa681a8516-kube-api-access-49t7v\") pod \"auto-csr-approver-29567202-zhspx\" (UID: \"a16fca88-e24b-40d2-ae8c-b4fa681a8516\") " pod="openshift-infra/auto-csr-approver-29567202-zhspx" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.305864 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49t7v\" (UniqueName: \"kubernetes.io/projected/a16fca88-e24b-40d2-ae8c-b4fa681a8516-kube-api-access-49t7v\") pod \"auto-csr-approver-29567202-zhspx\" (UID: \"a16fca88-e24b-40d2-ae8c-b4fa681a8516\") " pod="openshift-infra/auto-csr-approver-29567202-zhspx" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.324429 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49t7v\" (UniqueName: \"kubernetes.io/projected/a16fca88-e24b-40d2-ae8c-b4fa681a8516-kube-api-access-49t7v\") pod \"auto-csr-approver-29567202-zhspx\" (UID: \"a16fca88-e24b-40d2-ae8c-b4fa681a8516\") " pod="openshift-infra/auto-csr-approver-29567202-zhspx" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.458983 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-zhspx" Mar 20 18:42:00 crc kubenswrapper[4690]: I0320 18:42:00.913969 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567202-zhspx"] Mar 20 18:42:01 crc kubenswrapper[4690]: W0320 18:42:01.018024 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda16fca88_e24b_40d2_ae8c_b4fa681a8516.slice/crio-2c3fa01ece8b989ef99c7d84ceae7c4add88e0dbaf613e610794943f7d06ef12 WatchSource:0}: Error finding container 2c3fa01ece8b989ef99c7d84ceae7c4add88e0dbaf613e610794943f7d06ef12: Status 404 returned error can't find the container with id 2c3fa01ece8b989ef99c7d84ceae7c4add88e0dbaf613e610794943f7d06ef12 Mar 20 18:42:01 crc kubenswrapper[4690]: I0320 18:42:01.024039 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:42:01 crc kubenswrapper[4690]: I0320 18:42:01.096974 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-zhspx" event={"ID":"a16fca88-e24b-40d2-ae8c-b4fa681a8516","Type":"ContainerStarted","Data":"2c3fa01ece8b989ef99c7d84ceae7c4add88e0dbaf613e610794943f7d06ef12"} Mar 20 18:42:03 crc kubenswrapper[4690]: I0320 18:42:03.113886 4690 generic.go:334] "Generic (PLEG): container finished" podID="a16fca88-e24b-40d2-ae8c-b4fa681a8516" containerID="1f6cc1187b0561c33045c29f84dc045a8b8a7a3b8e3801a9a8a7616128865d44" exitCode=0 Mar 20 18:42:03 crc kubenswrapper[4690]: I0320 18:42:03.113993 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-zhspx" event={"ID":"a16fca88-e24b-40d2-ae8c-b4fa681a8516","Type":"ContainerDied","Data":"1f6cc1187b0561c33045c29f84dc045a8b8a7a3b8e3801a9a8a7616128865d44"} Mar 20 18:42:04 crc kubenswrapper[4690]: I0320 18:42:04.480812 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-zhspx" Mar 20 18:42:04 crc kubenswrapper[4690]: I0320 18:42:04.594084 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49t7v\" (UniqueName: \"kubernetes.io/projected/a16fca88-e24b-40d2-ae8c-b4fa681a8516-kube-api-access-49t7v\") pod \"a16fca88-e24b-40d2-ae8c-b4fa681a8516\" (UID: \"a16fca88-e24b-40d2-ae8c-b4fa681a8516\") " Mar 20 18:42:04 crc kubenswrapper[4690]: I0320 18:42:04.600820 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16fca88-e24b-40d2-ae8c-b4fa681a8516-kube-api-access-49t7v" (OuterVolumeSpecName: "kube-api-access-49t7v") pod "a16fca88-e24b-40d2-ae8c-b4fa681a8516" (UID: "a16fca88-e24b-40d2-ae8c-b4fa681a8516"). InnerVolumeSpecName "kube-api-access-49t7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:42:04 crc kubenswrapper[4690]: I0320 18:42:04.696534 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49t7v\" (UniqueName: \"kubernetes.io/projected/a16fca88-e24b-40d2-ae8c-b4fa681a8516-kube-api-access-49t7v\") on node \"crc\" DevicePath \"\"" Mar 20 18:42:05 crc kubenswrapper[4690]: I0320 18:42:05.129956 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-zhspx" event={"ID":"a16fca88-e24b-40d2-ae8c-b4fa681a8516","Type":"ContainerDied","Data":"2c3fa01ece8b989ef99c7d84ceae7c4add88e0dbaf613e610794943f7d06ef12"} Mar 20 18:42:05 crc kubenswrapper[4690]: I0320 18:42:05.130507 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c3fa01ece8b989ef99c7d84ceae7c4add88e0dbaf613e610794943f7d06ef12" Mar 20 18:42:05 crc kubenswrapper[4690]: I0320 18:42:05.130029 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-zhspx" Mar 20 18:42:05 crc kubenswrapper[4690]: I0320 18:42:05.558636 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-fwszr"] Mar 20 18:42:05 crc kubenswrapper[4690]: I0320 18:42:05.570306 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-fwszr"] Mar 20 18:42:05 crc kubenswrapper[4690]: I0320 18:42:05.897341 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be67ef1-def0-48f6-8766-80d72249d2d5" path="/var/lib/kubelet/pods/8be67ef1-def0-48f6-8766-80d72249d2d5/volumes" Mar 20 18:42:18 crc kubenswrapper[4690]: I0320 18:42:18.605735 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-944584c5d-v2mwf_7d7354f4-3635-4c6c-a382-f405c559ef59/barbican-api/0.log" Mar 20 18:42:18 crc kubenswrapper[4690]: I0320 18:42:18.696990 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-944584c5d-v2mwf_7d7354f4-3635-4c6c-a382-f405c559ef59/barbican-api-log/0.log" Mar 20 18:42:18 crc kubenswrapper[4690]: I0320 18:42:18.785020 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f7645db4-ph4nl_eb2225a2-e763-42d5-affd-562463c266e6/barbican-keystone-listener/0.log" Mar 20 18:42:18 crc kubenswrapper[4690]: I0320 18:42:18.852592 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f7645db4-ph4nl_eb2225a2-e763-42d5-affd-562463c266e6/barbican-keystone-listener-log/0.log" Mar 20 18:42:18 crc kubenswrapper[4690]: I0320 18:42:18.955601 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d4bbb45f-p692s_d6df0c1b-ea55-44ae-8fb9-9573c54322a8/barbican-worker/0.log" Mar 20 18:42:18 crc kubenswrapper[4690]: I0320 18:42:18.968155 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d4bbb45f-p692s_d6df0c1b-ea55-44ae-8fb9-9573c54322a8/barbican-worker-log/0.log" Mar 20 18:42:19 crc kubenswrapper[4690]: I0320 18:42:19.209399 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5/ceilometer-central-agent/0.log" Mar 20 18:42:19 crc kubenswrapper[4690]: I0320 18:42:19.240610 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zmwzf_33405126-fa78-4ad4-9587-e157ffd9f389/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:19 crc kubenswrapper[4690]: I0320 18:42:19.319945 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5/ceilometer-notification-agent/0.log" Mar 20 18:42:19 crc kubenswrapper[4690]: I0320 18:42:19.796491 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5/proxy-httpd/0.log" Mar 20 18:42:19 crc kubenswrapper[4690]: I0320 18:42:19.812399 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fc998c2a-5f75-4a3b-b62a-1d6f8fbfc6d5/sg-core/0.log" Mar 20 18:42:19 crc kubenswrapper[4690]: I0320 18:42:19.907138 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_370661b8-157c-4a7f-ae3e-379d122d48b3/cinder-api/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.010659 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_370661b8-157c-4a7f-ae3e-379d122d48b3/cinder-api-log/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.121683 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_582dd6c0-32f0-41f1-b62d-2dfc7f5b6509/probe/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.139708 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_582dd6c0-32f0-41f1-b62d-2dfc7f5b6509/cinder-scheduler/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.323693 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lbwch_6983c278-26ba-4802-9320-1270d48b04ce/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.464627 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p85rs_a59c2f4e-a048-421a-b4db-5411eeb2c3fd/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.577119 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q7j6l_50265e08-57d1-4ae0-8434-086c38b3e525/init/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.714602 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q7j6l_50265e08-57d1-4ae0-8434-086c38b3e525/init/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.825005 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-q7j6l_50265e08-57d1-4ae0-8434-086c38b3e525/dnsmasq-dns/0.log" Mar 20 18:42:20 crc kubenswrapper[4690]: I0320 18:42:20.834288 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6nfgk_86d7b6e3-05d5-475d-b95f-9ba0d5b43df4/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:21 crc kubenswrapper[4690]: I0320 18:42:21.687101 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9b415aa0-2e76-4f43-8f53-2da695c5b62e/glance-httpd/0.log" Mar 20 18:42:21 crc kubenswrapper[4690]: I0320 18:42:21.826292 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_9b415aa0-2e76-4f43-8f53-2da695c5b62e/glance-log/0.log" Mar 20 18:42:21 crc kubenswrapper[4690]: I0320 18:42:21.837189 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d70983fe-8325-430a-beeb-fa3b8007e70e/glance-httpd/0.log" Mar 20 18:42:21 crc kubenswrapper[4690]: I0320 18:42:21.908504 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d70983fe-8325-430a-beeb-fa3b8007e70e/glance-log/0.log" Mar 20 18:42:22 crc kubenswrapper[4690]: I0320 18:42:22.095103 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dc95ccffb-gvrdq_799b195a-e6e5-4a19-b41a-1c7550e21e90/horizon/0.log" Mar 20 18:42:22 crc kubenswrapper[4690]: I0320 18:42:22.370936 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-w5shc_e1dd8af3-0ac3-42c4-ba88-c891b8c971bd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:22 crc kubenswrapper[4690]: I0320 18:42:22.489954 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-dc95ccffb-gvrdq_799b195a-e6e5-4a19-b41a-1c7550e21e90/horizon-log/0.log" Mar 20 18:42:22 crc kubenswrapper[4690]: I0320 18:42:22.652805 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567161-sq958_3d7d7e4d-2f06-4abf-aa2d-ff85ac933f66/keystone-cron/0.log" Mar 20 18:42:22 crc kubenswrapper[4690]: I0320 18:42:22.741406 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xpjkk_7bdd8e58-aee6-495b-85b6-6d4ce7de1bdc/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:22 crc kubenswrapper[4690]: I0320 18:42:22.765063 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b966595d7-ccrp2_112d1eb4-f375-4825-94e3-d721fbafbeaa/keystone-api/0.log" Mar 20 18:42:22 crc kubenswrapper[4690]: I0320 18:42:22.852233 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4d1751ac-6582-4c73-aef9-12952bde5126/kube-state-metrics/0.log" Mar 20 18:42:23 crc kubenswrapper[4690]: I0320 18:42:23.371989 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-675c5fd7b7-z9vsh_1ce9f480-c11d-4009-98e7-8e1d4a13ecd8/neutron-httpd/0.log" Mar 20 18:42:23 crc kubenswrapper[4690]: I0320 18:42:23.414521 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-675c5fd7b7-z9vsh_1ce9f480-c11d-4009-98e7-8e1d4a13ecd8/neutron-api/0.log" Mar 20 18:42:23 crc kubenswrapper[4690]: I0320 18:42:23.657043 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qd7s_c59bc866-150a-4671-8bbf-91aea8f32646/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:23 crc kubenswrapper[4690]: I0320 18:42:23.658151 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-926mx_ca6878cf-74a4-4bf6-8e36-bf1a669d787f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:24 crc kubenswrapper[4690]: I0320 18:42:24.101324 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5f8feaad-3661-4ea6-9e2d-90cf79d48df9/nova-api-log/0.log" Mar 20 18:42:24 crc kubenswrapper[4690]: I0320 18:42:24.114939 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_9d9df793-f6e2-4d60-a54d-971847c8d3ea/nova-cell0-conductor-conductor/0.log" Mar 20 18:42:24 crc kubenswrapper[4690]: I0320 18:42:24.457291 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b1aa290f-4335-4859-83e2-b2283b49e235/nova-cell1-conductor-conductor/0.log" Mar 20 18:42:24 crc kubenswrapper[4690]: I0320 18:42:24.511535 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0d61bbf6-923c-45e5-9e55-42cb69c00b3b/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 18:42:24 crc kubenswrapper[4690]: I0320 18:42:24.521624 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5f8feaad-3661-4ea6-9e2d-90cf79d48df9/nova-api-api/0.log" Mar 20 18:42:24 crc kubenswrapper[4690]: I0320 18:42:24.797323 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f/nova-metadata-log/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.213476 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bc7c7487-ca7b-46c1-9cfd-6b9a34a0253f/nova-metadata-metadata/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.244879 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-bs8n5_8146ff99-3308-4b91-b487-3bd707bed4dd/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.267819 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a6c3a0b8-8793-4e94-bbee-851b32f0a393/nova-scheduler-scheduler/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.454335 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d4aa597c-8302-463f-a383-39c9a51baa2c/mysql-bootstrap/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.623038 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d4aa597c-8302-463f-a383-39c9a51baa2c/galera/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.640657 4690 scope.go:117] "RemoveContainer" containerID="a0149ee86bf7e7295aac586808e93729dd144cea23e38a82dbb01e8e53c0e5cf" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.689069 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d4aa597c-8302-463f-a383-39c9a51baa2c/mysql-bootstrap/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.744743 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dacc9bed-eaa9-4747-8a92-30f5afa0a698/mysql-bootstrap/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.860515 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dacc9bed-eaa9-4747-8a92-30f5afa0a698/mysql-bootstrap/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.923393 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dacc9bed-eaa9-4747-8a92-30f5afa0a698/galera/0.log" Mar 20 18:42:25 crc kubenswrapper[4690]: I0320 18:42:25.976103 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a537f291-8787-434c-84bc-4355ccccbe47/openstackclient/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.152150 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-j8pr4_f0e78344-d5a9-4bc2-9556-e3daf0ce19db/ovn-controller/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.215323 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bv2wl_f8bc22b8-57e1-4cfd-bce8-446fb8cee600/openstack-network-exporter/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.384923 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8dmtk_497fed5f-b87a-4042-ae22-186983ed7536/ovsdb-server-init/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.518177 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8dmtk_497fed5f-b87a-4042-ae22-186983ed7536/ovs-vswitchd/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.555128 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8dmtk_497fed5f-b87a-4042-ae22-186983ed7536/ovsdb-server-init/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.563348 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8dmtk_497fed5f-b87a-4042-ae22-186983ed7536/ovsdb-server/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.786501 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6a18387d-9d4e-4fd5-bdb3-8568831a7930/openstack-network-exporter/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.841044 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-jhldh_64a253c9-3348-4ba3-9d9a-755348ebf561/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:26 crc kubenswrapper[4690]: I0320 18:42:26.881560 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6a18387d-9d4e-4fd5-bdb3-8568831a7930/ovn-northd/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.042185 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_172668db-85fb-47e1-82fe-dee7c454993e/ovsdbserver-nb/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.072599 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_172668db-85fb-47e1-82fe-dee7c454993e/openstack-network-exporter/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.199433 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d64e7a22-5bb9-49ec-95d6-7ff145a31f9a/openstack-network-exporter/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.277693 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d64e7a22-5bb9-49ec-95d6-7ff145a31f9a/ovsdbserver-sb/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.488533 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77ff877fdd-nntbj_02713b3f-f042-40fc-a24e-f68ac876ae20/placement-api/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.531847 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-77ff877fdd-nntbj_02713b3f-f042-40fc-a24e-f68ac876ae20/placement-log/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.556046 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b93f0757-6c7a-473f-80e5-f4b9e7f88fad/setup-container/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.771166 4690 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b93f0757-6c7a-473f-80e5-f4b9e7f88fad/setup-container/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.869884 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab528fee-94bb-4907-aca5-97dcabef8332/setup-container/0.log" Mar 20 18:42:27 crc kubenswrapper[4690]: I0320 18:42:27.874639 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b93f0757-6c7a-473f-80e5-f4b9e7f88fad/rabbitmq/0.log" Mar 20 18:42:28 crc kubenswrapper[4690]: I0320 18:42:28.579677 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab528fee-94bb-4907-aca5-97dcabef8332/setup-container/0.log" Mar 20 18:42:28 crc kubenswrapper[4690]: I0320 18:42:28.673451 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6q99q_94cbf02f-b47c-44f2-ab14-bd01174bcc77/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:28 crc kubenswrapper[4690]: I0320 18:42:28.685956 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab528fee-94bb-4907-aca5-97dcabef8332/rabbitmq/0.log" Mar 20 18:42:28 crc kubenswrapper[4690]: I0320 18:42:28.851385 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-h7849_0fdbed5c-e2a7-42ee-9e92-68d0bbbff023/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:28 crc kubenswrapper[4690]: I0320 18:42:28.893612 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-krmd6_05a81786-36ff-4e8b-9bba-5e0ebbfc3247/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:29 crc kubenswrapper[4690]: I0320 18:42:29.042164 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8zmzz_d7945514-9f35-4a0f-86f3-3d8e03a03d75/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:29 crc kubenswrapper[4690]: I0320 18:42:29.110423 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-fnwhc_7d323f18-a4a8-4074-8b3f-cafcb23bcd33/ssh-known-hosts-edpm-deployment/0.log" Mar 20 18:42:29 crc kubenswrapper[4690]: I0320 18:42:29.368238 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-799f9bd8b7-4q7w9_3f074183-2793-4719-95b3-c2df447c93ab/proxy-server/0.log" Mar 20 18:42:29 crc kubenswrapper[4690]: I0320 18:42:29.489318 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dsjgc_7930c325-4b03-450e-b3d0-b7116efc71cb/swift-ring-rebalance/0.log" Mar 20 18:42:29 crc kubenswrapper[4690]: I0320 18:42:29.518670 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-799f9bd8b7-4q7w9_3f074183-2793-4719-95b3-c2df447c93ab/proxy-httpd/0.log" Mar 20 18:42:29 crc kubenswrapper[4690]: I0320 18:42:29.568994 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/account-auditor/0.log" Mar 20 18:42:29 crc kubenswrapper[4690]: I0320 18:42:29.654377 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/account-reaper/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.240625 4690 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/account-server/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.267752 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/container-auditor/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.274809 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/container-replicator/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.275247 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/account-replicator/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.455973 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/container-server/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.489333 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-expirer/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.512487 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/container-updater/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.552600 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-auditor/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.662947 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-replicator/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.688743 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-updater/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.690773 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/object-server/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.775681 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/rsync/0.log" Mar 20 18:42:30 crc kubenswrapper[4690]: I0320 18:42:30.847557 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4191d8a1-c023-4412-a90c-e819672da33f/swift-recon-cron/0.log" Mar 20 18:42:31 crc kubenswrapper[4690]: I0320 18:42:31.113163 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_86a8f040-c0ab-4923-8bab-8123fd72e63e/tempest-tests-tempest-tests-runner/0.log" Mar 20 18:42:31 crc kubenswrapper[4690]: I0320 18:42:31.223363 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_af317ab4-ee88-4ad6-b2c8-02b26765f15f/test-operator-logs-container/0.log" Mar 20 18:42:31 crc kubenswrapper[4690]: I0320 18:42:31.338794 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qn629_0fb2f304-f772-4ce8-8372-177341555106/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:31 crc kubenswrapper[4690]: I0320 18:42:31.427820 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-nszcx_3fa5c87e-a9cd-4046-9344-3a66c0c0977c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:42:42 crc kubenswrapper[4690]: I0320 18:42:42.275980 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b74da73d-632e-490b-b3c7-22450d29ede6/memcached/0.log" Mar 20 18:42:58 crc kubenswrapper[4690]: I0320 18:42:58.783815 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-5bf49_06fbcef9-d6fa-4dac-bfeb-93e3fc501f55/manager/0.log" Mar 20 18:42:58 crc kubenswrapper[4690]: I0320 18:42:58.930874 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/util/0.log" Mar 20 18:42:59 crc kubenswrapper[4690]: I0320 18:42:59.214732 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/pull/0.log" Mar 20 18:42:59 crc kubenswrapper[4690]: I0320 18:42:59.219145 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/util/0.log" Mar 20 18:42:59 crc kubenswrapper[4690]: I0320 18:42:59.245647 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/pull/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.018946 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/pull/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.061687 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/util/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.069429 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d43f0597759d048207e8f8942c34d73ccb7a2672e1af8b0630dbcc16b1wfdpl_d1b6dbe3-2fff-4985-91e0-270e2d42fcbc/extract/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.274385 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-27xp7_1c5d887a-7a69-4f43-8b75-36de19325428/manager/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.386291 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-mbp48_a3db7a74-f9a7-4dfc-89a3-5727f538a3a7/manager/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.514121 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-wclkd_6a4da3b7-e419-4565-8b7a-2f3f3fd20aa1/manager/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.537064 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-rtb26_d71d628c-8060-418c-b0bf-f83193220e88/manager/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.699360 4690 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-w2m9s_3871373d-0b43-4e90-84f8-01ee2e8e4159/manager/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.897988 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-mmwbh_09c39274-3aa3-4774-98e2-10f70b707a97/manager/0.log" Mar 20 18:43:00 crc kubenswrapper[4690]: I0320 18:43:00.992231 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-c55d6cc99-gzcjf_535eb2e4-3de8-49bd-97a8-135823a8d1c9/manager/0.log" Mar 20 18:43:01 crc kubenswrapper[4690]: I0320 18:43:01.078306 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-xgkr9_dc81b6ac-2881-4a5c-b3f6-e09fc1c634e4/manager/0.log" Mar 20 18:43:01 crc kubenswrapper[4690]: I0320 18:43:01.158190 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-tllng_a8f81ddb-a5b3-4881-88de-66ed78d8d344/manager/0.log" Mar 20 18:43:01 crc kubenswrapper[4690]: I0320 18:43:01.306105 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-pxcl7_3efbe084-4e50-405e-b477-b3b87635d465/manager/0.log" Mar 20 18:43:01 crc kubenswrapper[4690]: I0320 18:43:01.407387 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-xs6lt_61746313-5249-48cd-8dae-f7984ba74f85/manager/0.log" Mar 20 18:43:01 crc kubenswrapper[4690]: I0320 18:43:01.570225 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-55787_458fe699-42d5-44ad-9288-3b6fbcd87161/manager/0.log" Mar 20 18:43:01 crc kubenswrapper[4690]: I0320 18:43:01.654817 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-xzzx7_a31564d4-ce19-4893-bde8-871ced7c077b/manager/0.log" Mar 20 18:43:01 crc kubenswrapper[4690]: I0320 18:43:01.726982 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f5m7n5b_f3bca6f7-be2b-4420-8664-b94ba53d5f7f/manager/0.log" Mar 20 18:43:01 crc kubenswrapper[4690]: I0320 18:43:01.907031 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-77cd8cbff5-64vnn_2e2d986c-6f20-4436-a367-df98a71f79f0/operator/0.log" Mar 20 18:43:02 crc kubenswrapper[4690]: I0320 18:43:02.127636 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xcb58_595a25b2-d477-4ec7-b9ad-8eb670c2ea3f/registry-server/0.log" Mar 20 18:43:02 crc kubenswrapper[4690]: I0320 18:43:02.277916 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-48tn7_3730dc8b-cf83-4f29-ac0b-3776ef3efeba/manager/0.log" Mar 20 18:43:02 crc kubenswrapper[4690]: I0320 18:43:02.476329 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-fm55z_a27fabe1-095d-4c34-8e91-862aa1dbf964/manager/0.log" Mar 20 18:43:02 crc kubenswrapper[4690]: I0320 18:43:02.713362 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-klmr6_64a2959f-0b79-4b19-934b-486aad0e782b/manager/0.log" Mar 20 18:43:02 crc kubenswrapper[4690]: I0320 18:43:02.907511 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-8kl25_c7a8e424-00f3-4e97-b6b3-bd2513624b2e/manager/0.log" Mar 20 18:43:02 crc kubenswrapper[4690]: I0320 18:43:02.921835 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-vnl4s_d7d5bc08-99d0-4361-ae0e-ca9732db6154/manager/0.log" Mar 20 18:43:03 crc kubenswrapper[4690]: I0320 18:43:03.020169 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54c9b8654f-ms4r7_26b1c9fc-55f9-4895-9d23-a7c7e0e811c3/manager/0.log" Mar 20 18:43:03 crc kubenswrapper[4690]: I0320 18:43:03.069645 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-pv7x5_64ced890-6363-43c3-83e5-0001c72851ef/manager/0.log" Mar 20 18:43:15 crc kubenswrapper[4690]: I0320 18:43:15.895855 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jcvgn"] Mar 20 18:43:15 crc kubenswrapper[4690]: E0320 18:43:15.896737 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16fca88-e24b-40d2-ae8c-b4fa681a8516" containerName="oc" Mar 20 18:43:15 crc kubenswrapper[4690]: I0320 18:43:15.896752 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16fca88-e24b-40d2-ae8c-b4fa681a8516" containerName="oc" Mar 20 18:43:15 crc kubenswrapper[4690]: I0320 18:43:15.896980 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16fca88-e24b-40d2-ae8c-b4fa681a8516" containerName="oc" Mar 20 18:43:15 crc kubenswrapper[4690]: I0320 18:43:15.898409 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:15 crc kubenswrapper[4690]: I0320 18:43:15.918376 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcvgn"] Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.020556 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvwtv\" (UniqueName: \"kubernetes.io/projected/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-kube-api-access-rvwtv\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.020845 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-catalog-content\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.020928 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-utilities\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.123034 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-utilities\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.123186 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvwtv\" (UniqueName: \"kubernetes.io/projected/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-kube-api-access-rvwtv\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.123220 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-catalog-content\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.123714 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-utilities\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.123761 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-catalog-content\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.142762 4690 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rvwtv\" (UniqueName: \"kubernetes.io/projected/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-kube-api-access-rvwtv\") pod \"redhat-marketplace-jcvgn\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.218918 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:16 crc kubenswrapper[4690]: I0320 18:43:16.727168 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcvgn"] Mar 20 18:43:17 crc kubenswrapper[4690]: I0320 18:43:17.821508 4690 generic.go:334] "Generic (PLEG): container finished" podID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerID="36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973" exitCode=0 Mar 20 18:43:17 crc kubenswrapper[4690]: I0320 18:43:17.821610 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcvgn" event={"ID":"db239cc7-bf99-4031-8bf1-e0ee242ab5ab","Type":"ContainerDied","Data":"36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973"} Mar 20 18:43:17 crc kubenswrapper[4690]: I0320 18:43:17.821943 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcvgn" event={"ID":"db239cc7-bf99-4031-8bf1-e0ee242ab5ab","Type":"ContainerStarted","Data":"7d346aa809b80be24f4a365e88b56c3c3c3b9352e9a10a9b0bcdd5c8a1168268"} Mar 20 18:43:19 crc kubenswrapper[4690]: I0320 18:43:19.839834 4690 generic.go:334] "Generic (PLEG): container finished" podID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerID="628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903" exitCode=0 Mar 20 18:43:19 crc kubenswrapper[4690]: I0320 18:43:19.839911 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcvgn" event={"ID":"db239cc7-bf99-4031-8bf1-e0ee242ab5ab","Type":"ContainerDied","Data":"628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903"} Mar 20 18:43:20 crc kubenswrapper[4690]: I0320 18:43:20.849988 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcvgn" event={"ID":"db239cc7-bf99-4031-8bf1-e0ee242ab5ab","Type":"ContainerStarted","Data":"781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f"} Mar 20 18:43:20 crc kubenswrapper[4690]: I0320 18:43:20.888031 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jcvgn" podStartSLOduration=3.295334918 podStartE2EDuration="5.888008009s" podCreationTimestamp="2026-03-20 18:43:15 +0000 UTC" firstStartedPulling="2026-03-20 18:43:17.824221021 +0000 UTC m=+4272.690046699" lastFinishedPulling="2026-03-20 18:43:20.416894092 +0000 UTC m=+4275.282719790" observedRunningTime="2026-03-20 18:43:20.881217606 +0000 UTC m=+4275.747043284" watchObservedRunningTime="2026-03-20 18:43:20.888008009 +0000 UTC m=+4275.753833707" Mar 20 18:43:23 crc kubenswrapper[4690]: I0320 18:43:23.521852 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kkhg7_dc26c755-5e1b-480b-b3ed-b3d3dee36d94/control-plane-machine-set-operator/0.log" Mar 20 18:43:23 crc kubenswrapper[4690]: I0320 18:43:23.730373 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9n98c_bb444275-6cc1-42be-b742-afc344a60995/kube-rbac-proxy/0.log" Mar 20 18:43:23 crc kubenswrapper[4690]: I0320 18:43:23.743040 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9n98c_bb444275-6cc1-42be-b742-afc344a60995/machine-api-operator/0.log" Mar 20 18:43:26 crc kubenswrapper[4690]: I0320 18:43:26.219889 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:26 crc kubenswrapper[4690]: I0320 18:43:26.221690 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:26 crc kubenswrapper[4690]: I0320 18:43:26.277435 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:26 crc kubenswrapper[4690]: I0320 18:43:26.945401 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:26 crc kubenswrapper[4690]: I0320 18:43:26.988873 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcvgn"] Mar 20 18:43:28 crc kubenswrapper[4690]: I0320 18:43:28.913278 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jcvgn" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerName="registry-server" containerID="cri-o://781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f" gracePeriod=2 Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.343371 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.501210 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvwtv\" (UniqueName: \"kubernetes.io/projected/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-kube-api-access-rvwtv\") pod \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.501525 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-catalog-content\") pod \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.501568 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-utilities\") pod \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\" (UID: \"db239cc7-bf99-4031-8bf1-e0ee242ab5ab\") " Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.502213 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-utilities" (OuterVolumeSpecName: "utilities") pod "db239cc7-bf99-4031-8bf1-e0ee242ab5ab" (UID: "db239cc7-bf99-4031-8bf1-e0ee242ab5ab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.512723 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-kube-api-access-rvwtv" (OuterVolumeSpecName: "kube-api-access-rvwtv") pod "db239cc7-bf99-4031-8bf1-e0ee242ab5ab" (UID: "db239cc7-bf99-4031-8bf1-e0ee242ab5ab"). InnerVolumeSpecName "kube-api-access-rvwtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.538204 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db239cc7-bf99-4031-8bf1-e0ee242ab5ab" (UID: "db239cc7-bf99-4031-8bf1-e0ee242ab5ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.602639 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvwtv\" (UniqueName: \"kubernetes.io/projected/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-kube-api-access-rvwtv\") on node \"crc\" DevicePath \"\"" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.602665 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.602675 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db239cc7-bf99-4031-8bf1-e0ee242ab5ab-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.925841 4690 generic.go:334] "Generic (PLEG): container finished" podID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerID="781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f" exitCode=0 Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.925903 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcvgn" event={"ID":"db239cc7-bf99-4031-8bf1-e0ee242ab5ab","Type":"ContainerDied","Data":"781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f"} Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.925958 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcvgn" event={"ID":"db239cc7-bf99-4031-8bf1-e0ee242ab5ab","Type":"ContainerDied","Data":"7d346aa809b80be24f4a365e88b56c3c3c3b9352e9a10a9b0bcdd5c8a1168268"} Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.925998 4690 scope.go:117] "RemoveContainer" containerID="781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.926083 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcvgn" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.965061 4690 scope.go:117] "RemoveContainer" containerID="628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903" Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.965137 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcvgn"] Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.976091 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcvgn"] Mar 20 18:43:29 crc kubenswrapper[4690]: I0320 18:43:29.986796 4690 scope.go:117] "RemoveContainer" containerID="36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973" Mar 20 18:43:30 crc kubenswrapper[4690]: I0320 18:43:30.031684 4690 scope.go:117] "RemoveContainer" containerID="781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f" Mar 20 18:43:30 crc kubenswrapper[4690]: E0320 18:43:30.032092 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f\": container with ID starting with 781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f not found: ID does not exist" containerID="781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f" Mar 20 18:43:30 crc kubenswrapper[4690]: I0320 18:43:30.032121 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f"} err="failed to get container status \"781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f\": rpc error: code = NotFound desc = could not find container \"781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f\": container with ID starting with 781b3dcfb6f7aa7947c9769238a96470093087e23851e1393d886eddd69dc57f not found: ID does not exist" Mar 20 18:43:30 crc kubenswrapper[4690]: I0320 18:43:30.032142 4690 scope.go:117] "RemoveContainer" containerID="628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903" Mar 20 18:43:30 crc kubenswrapper[4690]: E0320 18:43:30.032328 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903\": container with ID starting with 628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903 not found: ID does not exist" containerID="628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903" Mar 20 18:43:30 crc kubenswrapper[4690]: I0320 18:43:30.032348 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903"} err="failed to get container status \"628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903\": rpc error: code = NotFound desc = could not find container \"628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903\": container with ID starting with 628587148c108d89c7564e08a298c5d8780cd55361f323ced5315bae72526903 not found: ID does not exist" Mar 20 18:43:30 crc kubenswrapper[4690]: I0320 18:43:30.032361 4690 scope.go:117] "RemoveContainer" containerID="36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973" Mar 20 18:43:30 crc kubenswrapper[4690]: E0320 18:43:30.032529 4690 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973\": container with ID starting with 36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973 not found: ID does not exist" containerID="36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973" Mar 20 18:43:30 crc kubenswrapper[4690]: I0320 18:43:30.032548 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973"} err="failed to get container status \"36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973\": rpc error: code = NotFound desc = could not find container \"36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973\": container with ID starting with 36d13893c88c5acd1bbb0239d8d5f722730c2ea5738ca83c6648af61be62a973 not found: ID does not exist" Mar 20 18:43:31 crc kubenswrapper[4690]: I0320 18:43:31.893874 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" path="/var/lib/kubelet/pods/db239cc7-bf99-4031-8bf1-e0ee242ab5ab/volumes" Mar 20 18:43:37 crc kubenswrapper[4690]: I0320 18:43:37.822198 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-sj4vr_7f7c4ed7-ab53-40e9-8977-77afd116ce1b/cert-manager-controller/0.log" Mar 20 18:43:37 crc kubenswrapper[4690]: I0320 18:43:37.957443 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-5lwhr_2040aed1-0ccc-4068-8e68-5ddda58ddd5e/cert-manager-cainjector/0.log" Mar 20 18:43:38 crc kubenswrapper[4690]: I0320 18:43:38.836132 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-s9shq_46b55360-cf52-4b63-90e4-b578b7181c19/cert-manager-webhook/0.log" Mar 20 18:43:51 crc kubenswrapper[4690]: I0320 18:43:51.467036 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-66twl_bab1f41a-57aa-43a8-b690-62eb634c99dc/nmstate-console-plugin/0.log" Mar 20 18:43:51 crc kubenswrapper[4690]: I0320 18:43:51.610388 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-h7mj5_15577f5b-3df3-4e8e-bebe-6abe5379debf/nmstate-handler/0.log" Mar 20 18:43:51 crc kubenswrapper[4690]: I0320 18:43:51.667627 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zpndn_2a8c0e04-bbfb-46b1-8562-2a1697b85035/kube-rbac-proxy/0.log" Mar 20 18:43:51 crc kubenswrapper[4690]: I0320 18:43:51.681663 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zpndn_2a8c0e04-bbfb-46b1-8562-2a1697b85035/nmstate-metrics/0.log" Mar 20 18:43:51 crc kubenswrapper[4690]: I0320 18:43:51.825686 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-6th4x_7e787826-7e6f-4ac9-856e-73304533640d/nmstate-operator/0.log" Mar 20 18:43:51 crc kubenswrapper[4690]: I0320 18:43:51.867364 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-2gftm_943c74c5-b182-46da-9ea4-164a4eb553d0/nmstate-webhook/0.log" Mar 20 18:43:54 crc kubenswrapper[4690]: I0320 18:43:54.287026 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:43:54 crc kubenswrapper[4690]: I0320 18:43:54.287322 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.148257 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4wjch"] Mar 20 18:44:00 crc kubenswrapper[4690]: E0320 18:44:00.150695 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerName="extract-utilities" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.150827 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerName="extract-utilities" Mar 20 18:44:00 crc kubenswrapper[4690]: E0320 18:44:00.150925 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerName="registry-server" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.150993 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerName="registry-server" Mar 20 18:44:00 crc kubenswrapper[4690]: E0320 18:44:00.151055 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerName="extract-content" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.151112 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerName="extract-content" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.151391 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="db239cc7-bf99-4031-8bf1-e0ee242ab5ab" containerName="registry-server" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.152201 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4wjch" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.154298 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.154663 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.160825 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4wjch"] Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.168723 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.280439 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwz7q\" (UniqueName: \"kubernetes.io/projected/f313f18e-e2ae-49d0-92df-2a27a669f159-kube-api-access-nwz7q\") pod \"auto-csr-approver-29567204-4wjch\" (UID: \"f313f18e-e2ae-49d0-92df-2a27a669f159\") " pod="openshift-infra/auto-csr-approver-29567204-4wjch" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.382631 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwz7q\" (UniqueName: \"kubernetes.io/projected/f313f18e-e2ae-49d0-92df-2a27a669f159-kube-api-access-nwz7q\") pod \"auto-csr-approver-29567204-4wjch\" (UID: \"f313f18e-e2ae-49d0-92df-2a27a669f159\") " pod="openshift-infra/auto-csr-approver-29567204-4wjch" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.406435 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwz7q\" (UniqueName: \"kubernetes.io/projected/f313f18e-e2ae-49d0-92df-2a27a669f159-kube-api-access-nwz7q\") pod \"auto-csr-approver-29567204-4wjch\" (UID: \"f313f18e-e2ae-49d0-92df-2a27a669f159\") " pod="openshift-infra/auto-csr-approver-29567204-4wjch" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.470569 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4wjch" Mar 20 18:44:00 crc kubenswrapper[4690]: I0320 18:44:00.902255 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4wjch"] Mar 20 18:44:00 crc kubenswrapper[4690]: W0320 18:44:00.905373 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf313f18e_e2ae_49d0_92df_2a27a669f159.slice/crio-a501bc60af34bb1d8402ca03d44d42c946ade21b0c9424532aea099ddc263e1a WatchSource:0}: Error finding container a501bc60af34bb1d8402ca03d44d42c946ade21b0c9424532aea099ddc263e1a: Status 404 returned error can't find the container with id a501bc60af34bb1d8402ca03d44d42c946ade21b0c9424532aea099ddc263e1a Mar 20 18:44:01 crc kubenswrapper[4690]: I0320 18:44:01.205546 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4wjch" event={"ID":"f313f18e-e2ae-49d0-92df-2a27a669f159","Type":"ContainerStarted","Data":"a501bc60af34bb1d8402ca03d44d42c946ade21b0c9424532aea099ddc263e1a"} Mar 20 18:44:03 crc kubenswrapper[4690]: I0320 18:44:03.226828 4690 generic.go:334] "Generic (PLEG): container finished" podID="f313f18e-e2ae-49d0-92df-2a27a669f159" containerID="da4c7bf380ca28f1bfc7774e88a39af318882548719be67922e0f3436a9c119a" exitCode=0 Mar 20 18:44:03 crc kubenswrapper[4690]: I0320 18:44:03.226886 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4wjch" event={"ID":"f313f18e-e2ae-49d0-92df-2a27a669f159","Type":"ContainerDied","Data":"da4c7bf380ca28f1bfc7774e88a39af318882548719be67922e0f3436a9c119a"} Mar 20 18:44:04 crc kubenswrapper[4690]: I0320 18:44:04.573689 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4wjch" Mar 20 18:44:04 crc kubenswrapper[4690]: I0320 18:44:04.670485 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwz7q\" (UniqueName: \"kubernetes.io/projected/f313f18e-e2ae-49d0-92df-2a27a669f159-kube-api-access-nwz7q\") pod \"f313f18e-e2ae-49d0-92df-2a27a669f159\" (UID: \"f313f18e-e2ae-49d0-92df-2a27a669f159\") " Mar 20 18:44:04 crc kubenswrapper[4690]: I0320 18:44:04.676204 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f313f18e-e2ae-49d0-92df-2a27a669f159-kube-api-access-nwz7q" (OuterVolumeSpecName: "kube-api-access-nwz7q") pod "f313f18e-e2ae-49d0-92df-2a27a669f159" (UID: "f313f18e-e2ae-49d0-92df-2a27a669f159"). InnerVolumeSpecName "kube-api-access-nwz7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:44:04 crc kubenswrapper[4690]: I0320 18:44:04.772276 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwz7q\" (UniqueName: \"kubernetes.io/projected/f313f18e-e2ae-49d0-92df-2a27a669f159-kube-api-access-nwz7q\") on node \"crc\" DevicePath \"\"" Mar 20 18:44:05 crc kubenswrapper[4690]: I0320 18:44:05.254709 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4wjch" event={"ID":"f313f18e-e2ae-49d0-92df-2a27a669f159","Type":"ContainerDied","Data":"a501bc60af34bb1d8402ca03d44d42c946ade21b0c9424532aea099ddc263e1a"} Mar 20 18:44:05 crc kubenswrapper[4690]: I0320 18:44:05.255249 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a501bc60af34bb1d8402ca03d44d42c946ade21b0c9424532aea099ddc263e1a" Mar 20 18:44:05 crc kubenswrapper[4690]: I0320 18:44:05.254762 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4wjch" Mar 20 18:44:05 crc kubenswrapper[4690]: I0320 18:44:05.643007 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-c9q5m"] Mar 20 18:44:05 crc kubenswrapper[4690]: I0320 18:44:05.656731 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-c9q5m"] Mar 20 18:44:05 crc kubenswrapper[4690]: I0320 18:44:05.895555 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973021e2-d5d7-48a9-b0e1-487008ee4009" path="/var/lib/kubelet/pods/973021e2-d5d7-48a9-b0e1-487008ee4009/volumes" Mar 20 18:44:19 crc kubenswrapper[4690]: I0320 18:44:19.821184 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jbfw7_fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e/kube-rbac-proxy/0.log" Mar 20 18:44:19 crc kubenswrapper[4690]: I0320 18:44:19.896909 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jbfw7_fa6b1a4f-0c86-4aa0-8d19-f29a78797c6e/controller/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.087708 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-frr-files/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.269334 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-metrics/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.287892 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-reloader/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.295189 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-reloader/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.300388 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-frr-files/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.454047 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-frr-files/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.481496 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-reloader/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.516652 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-metrics/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.518134 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-metrics/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.681576 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-metrics/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.697960 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/controller/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.723064 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-reloader/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.746675 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/cp-frr-files/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.853480 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/frr-metrics/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.943278 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/kube-rbac-proxy/0.log" Mar 20 18:44:20 crc kubenswrapper[4690]: I0320 18:44:20.976194 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/kube-rbac-proxy-frr/0.log" Mar 20 18:44:21 crc kubenswrapper[4690]: I0320 18:44:21.082445 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/reloader/0.log" Mar 20 18:44:21 crc kubenswrapper[4690]: I0320 18:44:21.135352 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-fn9hk_60df4ef1-2b12-4b8f-aec8-0a716fa5f7d0/frr-k8s-webhook-server/0.log" Mar 20 18:44:21 crc kubenswrapper[4690]: I0320 18:44:21.366881 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-77445cdfc6-46r4h_fcfa35ef-e556-4c2e-a742-84265930366f/manager/0.log" Mar 20 18:44:21 crc kubenswrapper[4690]: I0320 18:44:21.514477 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b96b44647-l5zw2_7a5a20c4-0745-41d2-a2ff-f389423513b2/webhook-server/0.log" Mar 20 18:44:21 crc kubenswrapper[4690]: I0320 18:44:21.609509 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cfggn_9e0ecbbf-1e0c-408a-b58c-07cd90497c39/kube-rbac-proxy/0.log" Mar 20 18:44:22 crc kubenswrapper[4690]: I0320 18:44:22.202676 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cfggn_9e0ecbbf-1e0c-408a-b58c-07cd90497c39/speaker/0.log" Mar 20 18:44:22 crc kubenswrapper[4690]: I0320 18:44:22.575811 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9cwmb_2edd85fb-0387-4738-ba35-2b326b635a1b/frr/0.log" Mar 20 18:44:24 crc kubenswrapper[4690]: I0320 18:44:24.274046 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:44:24 crc kubenswrapper[4690]: I0320 18:44:24.274117 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:44:25 crc kubenswrapper[4690]: I0320 18:44:25.813084 4690 scope.go:117] "RemoveContainer" containerID="3a1c7a9f9da72a541c12435448886bf8fa98a757aa213cbb2267650c5ceaef50" Mar 20 18:44:37 crc kubenswrapper[4690]: I0320 18:44:37.248893 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/util/0.log" Mar 20 18:44:37 crc kubenswrapper[4690]: I0320 18:44:37.494849 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/util/0.log" Mar 20 18:44:37 crc kubenswrapper[4690]: I0320 18:44:37.510268 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/pull/0.log" Mar 20 18:44:37 crc kubenswrapper[4690]: I0320 18:44:37.529845 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/pull/0.log" Mar 20 18:44:37 crc kubenswrapper[4690]: I0320 18:44:37.708707 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/util/0.log" Mar 20 18:44:37 crc kubenswrapper[4690]: I0320 18:44:37.743086 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/pull/0.log" Mar 20 18:44:37 crc kubenswrapper[4690]: I0320 18:44:37.772920 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742tzwd_417924bb-8f83-4db1-b370-92e0fac118f4/extract/0.log" Mar 20 18:44:37 crc kubenswrapper[4690]: I0320 18:44:37.877984 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/util/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.077163 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/pull/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.077977 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/util/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.128696 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/pull/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.421885 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/util/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.438890 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/pull/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.452869 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1428tj_40861b19-ba1a-4adf-8ee2-25a7c3016940/extract/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.598760 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-utilities/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.801650 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-content/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.806205 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-content/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.806277 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-utilities/0.log" Mar 20 18:44:38 crc kubenswrapper[4690]: I0320 18:44:38.994558 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-content/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.079488 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/extract-utilities/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.237530 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-utilities/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.442949 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vpf95_2f23f3d2-ebe3-44b0-9872-dfb5da5932e2/registry-server/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.474978 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-content/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.476892 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-utilities/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.502297 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-content/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.634966 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-content/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.659658 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/extract-utilities/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.947644 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-utilities/0.log" Mar 20 18:44:39 crc kubenswrapper[4690]: I0320 18:44:39.951988 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7hpm8_23f72eed-c3c0-4aed-a4a8-8243c27a2785/marketplace-operator/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.138602 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-utilities/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.171036 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-content/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.205626 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-content/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.226615 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bnw8t_254fbc18-10d1-444c-aef5-12f66b65b191/registry-server/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.376699 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-utilities/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.409418 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/extract-content/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.582010 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xxql6_255ea7b7-2364-4ebf-9104-6a78278ee9c0/registry-server/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.617699 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-utilities/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.746559 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-utilities/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.808923 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-content/0.log" Mar 20 18:44:40 crc kubenswrapper[4690]: I0320 18:44:40.841378 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-content/0.log" Mar 20 18:44:41 crc kubenswrapper[4690]: I0320 18:44:41.046611 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-utilities/0.log" Mar 20 18:44:41 crc kubenswrapper[4690]: I0320 18:44:41.048079 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/extract-content/0.log" Mar 20 18:44:41 crc kubenswrapper[4690]: I0320 18:44:41.503941 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qx8lq_d788b569-8dbd-4311-bab4-04c7cd0f1444/registry-server/0.log" Mar 20 18:44:54 crc kubenswrapper[4690]: I0320 18:44:54.273511 4690 patch_prober.go:28] interesting pod/machine-config-daemon-wtg2q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:44:54 crc kubenswrapper[4690]: I0320 18:44:54.274014 4690 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:44:54 crc kubenswrapper[4690]: I0320 18:44:54.274058 4690 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" Mar 20 18:44:54 crc kubenswrapper[4690]: I0320 18:44:54.274625 4690 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293"} pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:44:54 crc kubenswrapper[4690]: I0320 18:44:54.274670 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" containerName="machine-config-daemon" containerID="cri-o://fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" gracePeriod=600 Mar 20 18:44:54 crc kubenswrapper[4690]: E0320 18:44:54.433076 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:44:55 crc kubenswrapper[4690]: I0320 18:44:55.083933 4690 generic.go:334] "Generic (PLEG): container finished" podID="c18651e4-89e3-43fd-a780-bfa6df87591e" 
containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" exitCode=0 Mar 20 18:44:55 crc kubenswrapper[4690]: I0320 18:44:55.084043 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerDied","Data":"fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293"} Mar 20 18:44:55 crc kubenswrapper[4690]: I0320 18:44:55.084624 4690 scope.go:117] "RemoveContainer" containerID="71d102b5805b8fe7f4d59a06973e7c063c629bb25d542cdb23626c14c624bec3" Mar 20 18:44:55 crc kubenswrapper[4690]: I0320 18:44:55.085342 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:44:55 crc kubenswrapper[4690]: E0320 18:44:55.085781 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.167827 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn"] Mar 20 18:45:00 crc kubenswrapper[4690]: E0320 18:45:00.168742 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f313f18e-e2ae-49d0-92df-2a27a669f159" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.168755 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="f313f18e-e2ae-49d0-92df-2a27a669f159" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.168946 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="f313f18e-e2ae-49d0-92df-2a27a669f159" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.169576 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.171645 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.171858 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.179313 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn"] Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.186590 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-config-volume\") pod \"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.186679 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-secret-volume\") pod \"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.186716 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6887\" (UniqueName: \"kubernetes.io/projected/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-kube-api-access-p6887\") pod \"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.288703 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-config-volume\") pod \"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.289850 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-secret-volume\") pod \"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.289908 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6887\" (UniqueName: \"kubernetes.io/projected/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-kube-api-access-p6887\") pod \"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.289707 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-config-volume\") pod 
\"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.321985 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-secret-volume\") pod \"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.330969 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6887\" (UniqueName: \"kubernetes.io/projected/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-kube-api-access-p6887\") pod \"collect-profiles-29567205-hgxbn\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:00 crc kubenswrapper[4690]: I0320 18:45:00.498773 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:01 crc kubenswrapper[4690]: I0320 18:45:01.046352 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn"] Mar 20 18:45:01 crc kubenswrapper[4690]: W0320 18:45:01.057801 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27a521b_aaf4_46ff_8f35_abbe8b6d7d2c.slice/crio-f96418259320203a6f14072f962642a690fea69019bd7defae034d56cec6f580 WatchSource:0}: Error finding container f96418259320203a6f14072f962642a690fea69019bd7defae034d56cec6f580: Status 404 returned error can't find the container with id f96418259320203a6f14072f962642a690fea69019bd7defae034d56cec6f580 Mar 20 18:45:01 crc kubenswrapper[4690]: I0320 18:45:01.143794 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" event={"ID":"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c","Type":"ContainerStarted","Data":"f96418259320203a6f14072f962642a690fea69019bd7defae034d56cec6f580"} Mar 20 18:45:02 crc kubenswrapper[4690]: I0320 18:45:02.153889 4690 generic.go:334] "Generic (PLEG): container finished" podID="d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c" containerID="4ab6e6ffabcdeda1c196c170d4961fa0bff58b59b12d4dc19cef3ee703e72d44" exitCode=0 Mar 20 18:45:02 crc kubenswrapper[4690]: I0320 18:45:02.153980 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" event={"ID":"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c","Type":"ContainerDied","Data":"4ab6e6ffabcdeda1c196c170d4961fa0bff58b59b12d4dc19cef3ee703e72d44"} Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.170164 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.171963 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" event={"ID":"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c","Type":"ContainerDied","Data":"f96418259320203a6f14072f962642a690fea69019bd7defae034d56cec6f580"} Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.172000 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f96418259320203a6f14072f962642a690fea69019bd7defae034d56cec6f580" Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.369129 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-config-volume\") pod \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.369266 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6887\" (UniqueName: \"kubernetes.io/projected/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-kube-api-access-p6887\") pod \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.369311 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-secret-volume\") pod \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\" (UID: \"d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c\") " Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.369976 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-config-volume" (OuterVolumeSpecName: "config-volume") pod "d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c" (UID: "d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.379490 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c" (UID: "d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.394054 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-kube-api-access-p6887" (OuterVolumeSpecName: "kube-api-access-p6887") pod "d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c" (UID: "d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c"). InnerVolumeSpecName "kube-api-access-p6887". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.470762 4690 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.470801 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6887\" (UniqueName: \"kubernetes.io/projected/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-kube-api-access-p6887\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:04 crc kubenswrapper[4690]: I0320 18:45:04.470815 4690 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:05 crc kubenswrapper[4690]: I0320 18:45:05.180718 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-hgxbn" Mar 20 18:45:05 crc kubenswrapper[4690]: I0320 18:45:05.263191 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb"] Mar 20 18:45:05 crc kubenswrapper[4690]: I0320 18:45:05.273725 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-sjbbb"] Mar 20 18:45:05 crc kubenswrapper[4690]: I0320 18:45:05.931666 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06575ad3-48a5-4fdc-8e69-761bf8ab240b" path="/var/lib/kubelet/pods/06575ad3-48a5-4fdc-8e69-761bf8ab240b/volumes" Mar 20 18:45:06 crc kubenswrapper[4690]: I0320 18:45:06.883961 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:45:06 crc kubenswrapper[4690]: E0320 18:45:06.884331 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:45:19 crc kubenswrapper[4690]: I0320 18:45:19.883523 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:45:19 crc kubenswrapper[4690]: E0320 18:45:19.884482 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:45:25 crc kubenswrapper[4690]: I0320 18:45:25.895482 4690 scope.go:117] "RemoveContainer" containerID="5033a1b7e5a1ae707b4a8d8bc47a1721de5645ededab65cca0f9f63de56c7796" Mar 20 18:45:32 crc kubenswrapper[4690]: I0320 18:45:32.883315 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:45:32 crc kubenswrapper[4690]: E0320 18:45:32.884980 4690 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:45:43 crc kubenswrapper[4690]: I0320 18:45:43.883940 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:45:43 crc kubenswrapper[4690]: E0320 18:45:43.885229 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:45:54 crc kubenswrapper[4690]: I0320 18:45:54.883156 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:45:54 crc kubenswrapper[4690]: E0320 18:45:54.884383 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.143038 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567206-dfmd7"] Mar 20 18:46:00 crc kubenswrapper[4690]: E0320 18:46:00.143911 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c" containerName="collect-profiles" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.143923 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c" containerName="collect-profiles" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.144113 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="d27a521b-aaf4-46ff-8f35-abbe8b6d7d2c" containerName="collect-profiles" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.144716 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.147544 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.150729 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.157324 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.158756 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567206-dfmd7"] Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.240656 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5m8h\" (UniqueName: \"kubernetes.io/projected/a7a13a04-67be-4afc-9675-6a6f8b2a7227-kube-api-access-t5m8h\") pod \"auto-csr-approver-29567206-dfmd7\" (UID: \"a7a13a04-67be-4afc-9675-6a6f8b2a7227\") " pod="openshift-infra/auto-csr-approver-29567206-dfmd7" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.342864 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5m8h\" (UniqueName: \"kubernetes.io/projected/a7a13a04-67be-4afc-9675-6a6f8b2a7227-kube-api-access-t5m8h\") pod \"auto-csr-approver-29567206-dfmd7\" (UID: \"a7a13a04-67be-4afc-9675-6a6f8b2a7227\") " pod="openshift-infra/auto-csr-approver-29567206-dfmd7" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.611468 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5m8h\" (UniqueName: \"kubernetes.io/projected/a7a13a04-67be-4afc-9675-6a6f8b2a7227-kube-api-access-t5m8h\") pod \"auto-csr-approver-29567206-dfmd7\" (UID: \"a7a13a04-67be-4afc-9675-6a6f8b2a7227\") " pod="openshift-infra/auto-csr-approver-29567206-dfmd7" Mar 20 18:46:00 crc kubenswrapper[4690]: I0320 18:46:00.797935 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" Mar 20 18:46:01 crc kubenswrapper[4690]: I0320 18:46:01.232738 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567206-dfmd7"] Mar 20 18:46:01 crc kubenswrapper[4690]: I0320 18:46:01.727032 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" event={"ID":"a7a13a04-67be-4afc-9675-6a6f8b2a7227","Type":"ContainerStarted","Data":"d9d152521c0716d29931fe885ccc60cae5478207d59f2d50b69bd3ca6406d41f"} Mar 20 18:46:02 crc kubenswrapper[4690]: I0320 18:46:02.740998 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" event={"ID":"a7a13a04-67be-4afc-9675-6a6f8b2a7227","Type":"ContainerStarted","Data":"db027b6f3af51a7913768bbb4095e4f5f804816607c72b99cbc0d58f5a01ffe0"} Mar 20 18:46:02 crc kubenswrapper[4690]: I0320 18:46:02.759459 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" podStartSLOduration=1.777273688 podStartE2EDuration="2.759439151s" podCreationTimestamp="2026-03-20 18:46:00 +0000 UTC" firstStartedPulling="2026-03-20 18:46:01.258379494 +0000 UTC m=+4436.124205172" lastFinishedPulling="2026-03-20 18:46:02.240544937 +0000 UTC m=+4437.106370635" observedRunningTime="2026-03-20 18:46:02.754002487 +0000 UTC m=+4437.619828165" watchObservedRunningTime="2026-03-20 18:46:02.759439151 +0000 UTC m=+4437.625264829" Mar 20 18:46:03 crc kubenswrapper[4690]: I0320 18:46:03.751927 4690 generic.go:334] "Generic (PLEG): container finished" podID="a7a13a04-67be-4afc-9675-6a6f8b2a7227" containerID="db027b6f3af51a7913768bbb4095e4f5f804816607c72b99cbc0d58f5a01ffe0" exitCode=0 Mar 20 18:46:03 crc kubenswrapper[4690]: I0320 18:46:03.751972 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" event={"ID":"a7a13a04-67be-4afc-9675-6a6f8b2a7227","Type":"ContainerDied","Data":"db027b6f3af51a7913768bbb4095e4f5f804816607c72b99cbc0d58f5a01ffe0"} Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.169758 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.237196 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5m8h\" (UniqueName: \"kubernetes.io/projected/a7a13a04-67be-4afc-9675-6a6f8b2a7227-kube-api-access-t5m8h\") pod \"a7a13a04-67be-4afc-9675-6a6f8b2a7227\" (UID: \"a7a13a04-67be-4afc-9675-6a6f8b2a7227\") " Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.246470 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a13a04-67be-4afc-9675-6a6f8b2a7227-kube-api-access-t5m8h" (OuterVolumeSpecName: "kube-api-access-t5m8h") pod "a7a13a04-67be-4afc-9675-6a6f8b2a7227" (UID: "a7a13a04-67be-4afc-9675-6a6f8b2a7227"). InnerVolumeSpecName "kube-api-access-t5m8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.339997 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5m8h\" (UniqueName: \"kubernetes.io/projected/a7a13a04-67be-4afc-9675-6a6f8b2a7227-kube-api-access-t5m8h\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.776698 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" event={"ID":"a7a13a04-67be-4afc-9675-6a6f8b2a7227","Type":"ContainerDied","Data":"d9d152521c0716d29931fe885ccc60cae5478207d59f2d50b69bd3ca6406d41f"} Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.776770 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9d152521c0716d29931fe885ccc60cae5478207d59f2d50b69bd3ca6406d41f" Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.776805 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-dfmd7" Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.850016 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-2z4fn"] Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.860648 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-2z4fn"] Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.892544 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:46:05 crc kubenswrapper[4690]: E0320 18:46:05.893064 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:46:05 crc kubenswrapper[4690]: I0320 18:46:05.907536 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be39da38-bef8-41df-9277-b515edecd46c" path="/var/lib/kubelet/pods/be39da38-bef8-41df-9277-b515edecd46c/volumes" Mar 20 18:46:18 crc kubenswrapper[4690]: I0320 18:46:18.883588 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:46:18 crc kubenswrapper[4690]: E0320 18:46:18.884811 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:46:25 crc kubenswrapper[4690]: I0320 18:46:25.960858 4690 scope.go:117] "RemoveContainer" containerID="dc3c16228c72e0cb3ecf520f98d305ffd477107c631cb40b3f014effbd621188" Mar 20 18:46:32 crc kubenswrapper[4690]: I0320 18:46:32.884732 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:46:32 crc kubenswrapper[4690]: E0320 18:46:32.885672 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:46:34 crc kubenswrapper[4690]: I0320 18:46:34.091824 4690 generic.go:334] "Generic (PLEG): container finished" podID="27395613-e0e9-49a0-a752-008c71dd5c23" containerID="c2ce96c4ac33ee45e2281d7cce9a08dd409a12388bd9bb0d846d20248ab79024" exitCode=0 Mar 20 18:46:34 crc kubenswrapper[4690]: I0320 18:46:34.091865 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" event={"ID":"27395613-e0e9-49a0-a752-008c71dd5c23","Type":"ContainerDied","Data":"c2ce96c4ac33ee45e2281d7cce9a08dd409a12388bd9bb0d846d20248ab79024"} Mar 20 18:46:34 crc kubenswrapper[4690]: I0320 18:46:34.092424 4690 scope.go:117] "RemoveContainer" containerID="c2ce96c4ac33ee45e2281d7cce9a08dd409a12388bd9bb0d846d20248ab79024" Mar 20 18:46:35 crc kubenswrapper[4690]: I0320 18:46:35.071288 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fhzm4_must-gather-jbnnt_27395613-e0e9-49a0-a752-008c71dd5c23/gather/0.log" Mar 20 18:46:43 crc kubenswrapper[4690]: I0320 18:46:43.884154 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:46:43 crc kubenswrapper[4690]: E0320 18:46:43.885191 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:46:46 crc kubenswrapper[4690]: I0320 18:46:46.797050 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fhzm4/must-gather-jbnnt"] Mar 20 18:46:46 crc kubenswrapper[4690]: I0320 18:46:46.798549 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" podUID="27395613-e0e9-49a0-a752-008c71dd5c23" containerName="copy" containerID="cri-o://3f525513029ea6b94d57effc8292fcc7fa821331b3259ea6aec14f8eae863f85" gracePeriod=2 Mar 20 18:46:46 crc kubenswrapper[4690]: I0320 18:46:46.807765 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fhzm4/must-gather-jbnnt"] Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.213455 4690 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fhzm4_must-gather-jbnnt_27395613-e0e9-49a0-a752-008c71dd5c23/copy/0.log" Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.214045 4690 generic.go:334] "Generic (PLEG): container finished" podID="27395613-e0e9-49a0-a752-008c71dd5c23" containerID="3f525513029ea6b94d57effc8292fcc7fa821331b3259ea6aec14f8eae863f85" exitCode=143 Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.214093 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adcff40499340913ed7b0fc5c6686b4220658e4f2d3f458f5b037846c748f655" Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.247616 4690 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-fhzm4_must-gather-jbnnt_27395613-e0e9-49a0-a752-008c71dd5c23/copy/0.log" Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.248035 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.393106 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27395613-e0e9-49a0-a752-008c71dd5c23-must-gather-output\") pod \"27395613-e0e9-49a0-a752-008c71dd5c23\" (UID: \"27395613-e0e9-49a0-a752-008c71dd5c23\") " Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.393160 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljq8t\" (UniqueName: \"kubernetes.io/projected/27395613-e0e9-49a0-a752-008c71dd5c23-kube-api-access-ljq8t\") pod \"27395613-e0e9-49a0-a752-008c71dd5c23\" (UID: \"27395613-e0e9-49a0-a752-008c71dd5c23\") " Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.401535 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27395613-e0e9-49a0-a752-008c71dd5c23-kube-api-access-ljq8t" (OuterVolumeSpecName: "kube-api-access-ljq8t") pod "27395613-e0e9-49a0-a752-008c71dd5c23" (UID: "27395613-e0e9-49a0-a752-008c71dd5c23"). InnerVolumeSpecName "kube-api-access-ljq8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.495154 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljq8t\" (UniqueName: \"kubernetes.io/projected/27395613-e0e9-49a0-a752-008c71dd5c23-kube-api-access-ljq8t\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.564098 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27395613-e0e9-49a0-a752-008c71dd5c23-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "27395613-e0e9-49a0-a752-008c71dd5c23" (UID: "27395613-e0e9-49a0-a752-008c71dd5c23"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.597555 4690 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27395613-e0e9-49a0-a752-008c71dd5c23-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:47 crc kubenswrapper[4690]: I0320 18:46:47.899711 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27395613-e0e9-49a0-a752-008c71dd5c23" path="/var/lib/kubelet/pods/27395613-e0e9-49a0-a752-008c71dd5c23/volumes" Mar 20 18:46:48 crc kubenswrapper[4690]: I0320 18:46:48.223339 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhzm4/must-gather-jbnnt" Mar 20 18:46:57 crc kubenswrapper[4690]: I0320 18:46:57.885155 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:46:57 crc kubenswrapper[4690]: E0320 18:46:57.886592 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:47:08 crc kubenswrapper[4690]: I0320 18:47:08.883565 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:47:08 crc kubenswrapper[4690]: E0320 18:47:08.884396 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:47:23 crc kubenswrapper[4690]: I0320 18:47:23.884199 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:47:23 crc kubenswrapper[4690]: E0320 18:47:23.885489 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:47:26 crc kubenswrapper[4690]: I0320 18:47:26.050665 4690 scope.go:117] "RemoveContainer" containerID="c2ce96c4ac33ee45e2281d7cce9a08dd409a12388bd9bb0d846d20248ab79024" Mar 20 18:47:26 crc kubenswrapper[4690]: I0320 18:47:26.135569 4690 scope.go:117] "RemoveContainer" containerID="3f525513029ea6b94d57effc8292fcc7fa821331b3259ea6aec14f8eae863f85" Mar 20 18:47:34 crc kubenswrapper[4690]: I0320 18:47:34.883494 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:47:34 crc kubenswrapper[4690]: E0320 18:47:34.884861 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:47:48 crc kubenswrapper[4690]: I0320 18:47:48.883790 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:47:48 crc kubenswrapper[4690]: E0320 18:47:48.884865 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.142456 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567208-kjmcw"] Mar 20 18:48:00 crc kubenswrapper[4690]: E0320 18:48:00.144621 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27395613-e0e9-49a0-a752-008c71dd5c23" containerName="copy" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.144646 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="27395613-e0e9-49a0-a752-008c71dd5c23" containerName="copy" Mar 20 18:48:00 crc kubenswrapper[4690]: E0320 18:48:00.144666 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a13a04-67be-4afc-9675-6a6f8b2a7227" containerName="oc" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.144672 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a13a04-67be-4afc-9675-6a6f8b2a7227" containerName="oc" Mar 20 18:48:00 crc kubenswrapper[4690]: E0320 18:48:00.144684 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27395613-e0e9-49a0-a752-008c71dd5c23" containerName="gather" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.144690 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="27395613-e0e9-49a0-a752-008c71dd5c23" containerName="gather" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.144888 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="27395613-e0e9-49a0-a752-008c71dd5c23" containerName="gather" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.144919 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a13a04-67be-4afc-9675-6a6f8b2a7227" containerName="oc" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.144932 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="27395613-e0e9-49a0-a752-008c71dd5c23" containerName="copy" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.145709 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567208-kjmcw" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.148085 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.148271 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.148553 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.163715 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567208-kjmcw"] Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.217134 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvflj\" (UniqueName: \"kubernetes.io/projected/ac8d281a-e43a-4efb-aeb7-f6e242b1afea-kube-api-access-bvflj\") pod \"auto-csr-approver-29567208-kjmcw\" (UID: \"ac8d281a-e43a-4efb-aeb7-f6e242b1afea\") " pod="openshift-infra/auto-csr-approver-29567208-kjmcw" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.318622 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvflj\" (UniqueName: \"kubernetes.io/projected/ac8d281a-e43a-4efb-aeb7-f6e242b1afea-kube-api-access-bvflj\") pod \"auto-csr-approver-29567208-kjmcw\" (UID: \"ac8d281a-e43a-4efb-aeb7-f6e242b1afea\") " pod="openshift-infra/auto-csr-approver-29567208-kjmcw" Mar 20 18:48:00 crc kubenswrapper[4690]: I0320 18:48:00.809980 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvflj\" (UniqueName: \"kubernetes.io/projected/ac8d281a-e43a-4efb-aeb7-f6e242b1afea-kube-api-access-bvflj\") pod \"auto-csr-approver-29567208-kjmcw\" (UID: \"ac8d281a-e43a-4efb-aeb7-f6e242b1afea\") " pod="openshift-infra/auto-csr-approver-29567208-kjmcw" Mar 20 18:48:01 crc kubenswrapper[4690]: I0320 18:48:01.061803 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567208-kjmcw" Mar 20 18:48:01 crc kubenswrapper[4690]: I0320 18:48:01.566399 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567208-kjmcw"] Mar 20 18:48:01 crc kubenswrapper[4690]: I0320 18:48:01.574249 4690 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:48:02 crc kubenswrapper[4690]: I0320 18:48:02.011162 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567208-kjmcw" event={"ID":"ac8d281a-e43a-4efb-aeb7-f6e242b1afea","Type":"ContainerStarted","Data":"b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56"} Mar 20 18:48:02 crc kubenswrapper[4690]: I0320 18:48:02.883737 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:48:02 crc kubenswrapper[4690]: E0320 18:48:02.884271 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:48:04 crc kubenswrapper[4690]: I0320 18:48:04.032491 4690 generic.go:334] "Generic (PLEG): container finished" podID="ac8d281a-e43a-4efb-aeb7-f6e242b1afea" containerID="05d095a8002886ed770279abb3f8073044228f682aa2feede89cbb504b6fbde3" exitCode=0 Mar 20 18:48:04 crc kubenswrapper[4690]: I0320 18:48:04.032602 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567208-kjmcw" event={"ID":"ac8d281a-e43a-4efb-aeb7-f6e242b1afea","Type":"ContainerDied","Data":"05d095a8002886ed770279abb3f8073044228f682aa2feede89cbb504b6fbde3"} Mar 20 18:48:05 crc kubenswrapper[4690]: I0320 18:48:05.380245 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567208-kjmcw" Mar 20 18:48:05 crc kubenswrapper[4690]: I0320 18:48:05.524241 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvflj\" (UniqueName: \"kubernetes.io/projected/ac8d281a-e43a-4efb-aeb7-f6e242b1afea-kube-api-access-bvflj\") pod \"ac8d281a-e43a-4efb-aeb7-f6e242b1afea\" (UID: \"ac8d281a-e43a-4efb-aeb7-f6e242b1afea\") " Mar 20 18:48:05 crc kubenswrapper[4690]: I0320 18:48:05.535549 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8d281a-e43a-4efb-aeb7-f6e242b1afea-kube-api-access-bvflj" (OuterVolumeSpecName: "kube-api-access-bvflj") pod "ac8d281a-e43a-4efb-aeb7-f6e242b1afea" (UID: "ac8d281a-e43a-4efb-aeb7-f6e242b1afea"). InnerVolumeSpecName "kube-api-access-bvflj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:48:05 crc kubenswrapper[4690]: I0320 18:48:05.626925 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvflj\" (UniqueName: \"kubernetes.io/projected/ac8d281a-e43a-4efb-aeb7-f6e242b1afea-kube-api-access-bvflj\") on node \"crc\" DevicePath \"\"" Mar 20 18:48:06 crc kubenswrapper[4690]: I0320 18:48:06.057853 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567208-kjmcw" event={"ID":"ac8d281a-e43a-4efb-aeb7-f6e242b1afea","Type":"ContainerDied","Data":"b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56"} Mar 20 18:48:06 crc kubenswrapper[4690]: I0320 18:48:06.058472 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56" Mar 20 18:48:06 crc kubenswrapper[4690]: I0320 18:48:06.057947 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567208-kjmcw" Mar 20 18:48:06 crc kubenswrapper[4690]: I0320 18:48:06.126542 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567202-zhspx"] Mar 20 18:48:06 crc kubenswrapper[4690]: I0320 18:48:06.136608 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567202-zhspx"] Mar 20 18:48:07 crc kubenswrapper[4690]: E0320 18:48:07.422748 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice/crio-b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:48:07 crc kubenswrapper[4690]: I0320 18:48:07.902637 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16fca88-e24b-40d2-ae8c-b4fa681a8516" path="/var/lib/kubelet/pods/a16fca88-e24b-40d2-ae8c-b4fa681a8516/volumes" Mar 20 18:48:13 crc kubenswrapper[4690]: I0320 18:48:13.883448 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:48:13 crc kubenswrapper[4690]: E0320 18:48:13.886574 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:48:17 crc kubenswrapper[4690]: E0320 18:48:17.682859 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice/crio-b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56\": RecentStats: unable to find data in memory cache]" Mar 20 18:48:26 crc kubenswrapper[4690]: I0320 
18:48:26.190824 4690 scope.go:117] "RemoveContainer" containerID="1f6cc1187b0561c33045c29f84dc045a8b8a7a3b8e3801a9a8a7616128865d44" Mar 20 18:48:27 crc kubenswrapper[4690]: I0320 18:48:27.883529 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:48:27 crc kubenswrapper[4690]: E0320 18:48:27.884491 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:48:27 crc kubenswrapper[4690]: E0320 18:48:27.978767 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice/crio-b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56\": RecentStats: unable to find data in memory cache]" Mar 20 18:48:38 crc kubenswrapper[4690]: E0320 18:48:38.259365 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice/crio-b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56\": RecentStats: unable to find data in memory cache]" Mar 20 18:48:41 crc kubenswrapper[4690]: I0320 18:48:41.884818 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:48:41 crc kubenswrapper[4690]: E0320 18:48:41.886029 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:48:48 crc kubenswrapper[4690]: E0320 18:48:48.486920 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice/crio-b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice\": RecentStats: unable to find data in memory cache]" Mar 20 18:48:55 crc kubenswrapper[4690]: I0320 18:48:55.889435 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:48:55 crc kubenswrapper[4690]: E0320 18:48:55.890116 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:48:58 crc kubenswrapper[4690]: E0320 18:48:58.761977 4690 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d281a_e43a_4efb_aeb7_f6e242b1afea.slice/crio-b8f15b3d97a37e172461ed8ab58b7ea462c291af7eeee3afe4f8e230517f8b56\": RecentStats: unable to find data in memory cache]" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.675159 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c7rtb"] Mar 20 18:49:10 crc kubenswrapper[4690]: E0320 18:49:10.676194 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8d281a-e43a-4efb-aeb7-f6e242b1afea" containerName="oc" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.676215 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8d281a-e43a-4efb-aeb7-f6e242b1afea" containerName="oc" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.676719 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8d281a-e43a-4efb-aeb7-f6e242b1afea" containerName="oc" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.678383 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.702126 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c7rtb"] Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.858980 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-utilities\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.859015 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-catalog-content\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.859134 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qk6n\" (UniqueName: \"kubernetes.io/projected/302ea27f-2a86-4bdc-a25a-7d4599f3545e-kube-api-access-7qk6n\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.882781 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:49:10 crc kubenswrapper[4690]: E0320 18:49:10.883050 4690 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.961011 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qk6n\" (UniqueName: \"kubernetes.io/projected/302ea27f-2a86-4bdc-a25a-7d4599f3545e-kube-api-access-7qk6n\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.961220 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-utilities\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.961252 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-catalog-content\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.961944 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-utilities\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.962011 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-catalog-content\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:10 crc kubenswrapper[4690]: I0320 18:49:10.984249 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qk6n\" (UniqueName: \"kubernetes.io/projected/302ea27f-2a86-4bdc-a25a-7d4599f3545e-kube-api-access-7qk6n\") pod \"community-operators-c7rtb\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:11 crc kubenswrapper[4690]: I0320 18:49:11.002873 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:11 crc kubenswrapper[4690]: I0320 18:49:11.509369 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c7rtb"] Mar 20 18:49:11 crc kubenswrapper[4690]: I0320 18:49:11.773135 4690 generic.go:334] "Generic (PLEG): container finished" podID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerID="b85576c3e1d575f2399424b6541f59ab99ee0a70183306aad043b7c19721a0b7" exitCode=0 Mar 20 18:49:11 crc kubenswrapper[4690]: I0320 18:49:11.773421 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7rtb" event={"ID":"302ea27f-2a86-4bdc-a25a-7d4599f3545e","Type":"ContainerDied","Data":"b85576c3e1d575f2399424b6541f59ab99ee0a70183306aad043b7c19721a0b7"} Mar 20 18:49:11 crc kubenswrapper[4690]: I0320 18:49:11.773488 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7rtb" event={"ID":"302ea27f-2a86-4bdc-a25a-7d4599f3545e","Type":"ContainerStarted","Data":"9fcfe97a854dbef9022efb7d908d871b21694d2f4898464c61e901f4c78f138c"} Mar 20 18:49:12 crc kubenswrapper[4690]: I0320 18:49:12.785302 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7rtb" event={"ID":"302ea27f-2a86-4bdc-a25a-7d4599f3545e","Type":"ContainerStarted","Data":"a7fc5e4d29765508c107994f94bed3f602873a3ada439223f24b2ab33c8b50aa"} Mar 20 18:49:13 crc kubenswrapper[4690]: I0320 18:49:13.797048 4690 generic.go:334] "Generic (PLEG): container finished" podID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerID="a7fc5e4d29765508c107994f94bed3f602873a3ada439223f24b2ab33c8b50aa" exitCode=0 Mar 20 18:49:13 crc kubenswrapper[4690]: I0320 18:49:13.797107 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7rtb" event={"ID":"302ea27f-2a86-4bdc-a25a-7d4599f3545e","Type":"ContainerDied","Data":"a7fc5e4d29765508c107994f94bed3f602873a3ada439223f24b2ab33c8b50aa"} Mar 20 18:49:14 crc kubenswrapper[4690]: I0320 18:49:14.812531 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7rtb" event={"ID":"302ea27f-2a86-4bdc-a25a-7d4599f3545e","Type":"ContainerStarted","Data":"eb90d89881d0dc5b6dc84363062967963103a0d622e9cdf6cfaf6b314e4e190d"} Mar 20 18:49:14 crc kubenswrapper[4690]: I0320 18:49:14.842204 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c7rtb" podStartSLOduration=2.355410576 podStartE2EDuration="4.842181317s" podCreationTimestamp="2026-03-20 18:49:10 +0000 UTC" firstStartedPulling="2026-03-20 18:49:11.774818293 +0000 UTC m=+4626.640643971" lastFinishedPulling="2026-03-20 18:49:14.261589034 +0000 UTC m=+4629.127414712" observedRunningTime="2026-03-20 18:49:14.835134547 +0000 UTC m=+4629.700960245" watchObservedRunningTime="2026-03-20 18:49:14.842181317 +0000 UTC m=+4629.708007005" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.057726 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2d2pk"] Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.060292 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.076017 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2d2pk"] Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.202513 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9v6\" (UniqueName: \"kubernetes.io/projected/5ec30ac9-d298-48d6-bdf1-570165be3bc2-kube-api-access-2p9v6\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.202663 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-catalog-content\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.202939 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-utilities\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.305019 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-catalog-content\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.305118 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-utilities\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.305176 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9v6\" (UniqueName: \"kubernetes.io/projected/5ec30ac9-d298-48d6-bdf1-570165be3bc2-kube-api-access-2p9v6\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.305651 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-catalog-content\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.305704 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-utilities\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.331711 4690 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2p9v6\" (UniqueName: \"kubernetes.io/projected/5ec30ac9-d298-48d6-bdf1-570165be3bc2-kube-api-access-2p9v6\") pod \"certified-operators-2d2pk\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.383403 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:17 crc kubenswrapper[4690]: W0320 18:49:17.878276 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec30ac9_d298_48d6_bdf1_570165be3bc2.slice/crio-cb226f0d1eae37b37ed3e47895faf21a3c46e934fd3159acd8bc5bfd367e3e42 WatchSource:0}: Error finding container cb226f0d1eae37b37ed3e47895faf21a3c46e934fd3159acd8bc5bfd367e3e42: Status 404 returned error can't find the container with id cb226f0d1eae37b37ed3e47895faf21a3c46e934fd3159acd8bc5bfd367e3e42 Mar 20 18:49:17 crc kubenswrapper[4690]: I0320 18:49:17.881781 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2d2pk"] Mar 20 18:49:18 crc kubenswrapper[4690]: I0320 18:49:18.851847 4690 generic.go:334] "Generic (PLEG): container finished" podID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerID="7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5" exitCode=0 Mar 20 18:49:18 crc kubenswrapper[4690]: I0320 18:49:18.851912 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d2pk" event={"ID":"5ec30ac9-d298-48d6-bdf1-570165be3bc2","Type":"ContainerDied","Data":"7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5"} Mar 20 18:49:18 crc kubenswrapper[4690]: I0320 18:49:18.852103 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d2pk" event={"ID":"5ec30ac9-d298-48d6-bdf1-570165be3bc2","Type":"ContainerStarted","Data":"cb226f0d1eae37b37ed3e47895faf21a3c46e934fd3159acd8bc5bfd367e3e42"} Mar 20 18:49:20 crc kubenswrapper[4690]: I0320 18:49:20.875214 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d2pk" event={"ID":"5ec30ac9-d298-48d6-bdf1-570165be3bc2","Type":"ContainerStarted","Data":"f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1"} Mar 20 18:49:21 crc kubenswrapper[4690]: I0320 18:49:21.003521 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:21 crc kubenswrapper[4690]: I0320 18:49:21.004705 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:21 crc kubenswrapper[4690]: I0320 18:49:21.065796 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:21 crc kubenswrapper[4690]: I0320 18:49:21.885187 4690 generic.go:334] "Generic (PLEG): container finished" podID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerID="f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1" exitCode=0 Mar 20 18:49:21 crc kubenswrapper[4690]: I0320 18:49:21.928678 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d2pk" 
event={"ID":"5ec30ac9-d298-48d6-bdf1-570165be3bc2","Type":"ContainerDied","Data":"f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1"} Mar 20 18:49:22 crc kubenswrapper[4690]: I0320 18:49:22.024875 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:22 crc kubenswrapper[4690]: I0320 18:49:22.897804 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d2pk" event={"ID":"5ec30ac9-d298-48d6-bdf1-570165be3bc2","Type":"ContainerStarted","Data":"fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba"} Mar 20 18:49:22 crc kubenswrapper[4690]: I0320 18:49:22.928987 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2d2pk" podStartSLOduration=2.446178497 podStartE2EDuration="5.928968373s" podCreationTimestamp="2026-03-20 18:49:17 +0000 UTC" firstStartedPulling="2026-03-20 18:49:18.854095098 +0000 UTC m=+4633.719920766" lastFinishedPulling="2026-03-20 18:49:22.336884954 +0000 UTC m=+4637.202710642" observedRunningTime="2026-03-20 18:49:22.922071787 +0000 UTC m=+4637.787897465" watchObservedRunningTime="2026-03-20 18:49:22.928968373 +0000 UTC m=+4637.794794051" Mar 20 18:49:23 crc kubenswrapper[4690]: I0320 18:49:23.883183 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:49:23 crc kubenswrapper[4690]: E0320 18:49:23.883748 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:49:24 crc kubenswrapper[4690]: I0320 18:49:24.445077 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c7rtb"] Mar 20 18:49:24 crc kubenswrapper[4690]: I0320 18:49:24.913345 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c7rtb" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerName="registry-server" containerID="cri-o://eb90d89881d0dc5b6dc84363062967963103a0d622e9cdf6cfaf6b314e4e190d" gracePeriod=2 Mar 20 18:49:25 crc kubenswrapper[4690]: I0320 18:49:25.923925 4690 generic.go:334] "Generic (PLEG): container finished" podID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerID="eb90d89881d0dc5b6dc84363062967963103a0d622e9cdf6cfaf6b314e4e190d" exitCode=0 Mar 20 18:49:25 crc kubenswrapper[4690]: I0320 18:49:25.923994 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7rtb" event={"ID":"302ea27f-2a86-4bdc-a25a-7d4599f3545e","Type":"ContainerDied","Data":"eb90d89881d0dc5b6dc84363062967963103a0d622e9cdf6cfaf6b314e4e190d"} Mar 20 18:49:25 crc kubenswrapper[4690]: I0320 18:49:25.924405 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7rtb" event={"ID":"302ea27f-2a86-4bdc-a25a-7d4599f3545e","Type":"ContainerDied","Data":"9fcfe97a854dbef9022efb7d908d871b21694d2f4898464c61e901f4c78f138c"} Mar 20 18:49:25 crc kubenswrapper[4690]: I0320 18:49:25.924428 4690 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9fcfe97a854dbef9022efb7d908d871b21694d2f4898464c61e901f4c78f138c" Mar 20 18:49:25 crc kubenswrapper[4690]: I0320 18:49:25.937462 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.124786 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-catalog-content\") pod \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.125004 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qk6n\" (UniqueName: \"kubernetes.io/projected/302ea27f-2a86-4bdc-a25a-7d4599f3545e-kube-api-access-7qk6n\") pod \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.125064 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-utilities\") pod \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\" (UID: \"302ea27f-2a86-4bdc-a25a-7d4599f3545e\") " Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.126215 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-utilities" (OuterVolumeSpecName: "utilities") pod "302ea27f-2a86-4bdc-a25a-7d4599f3545e" (UID: "302ea27f-2a86-4bdc-a25a-7d4599f3545e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.129792 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/302ea27f-2a86-4bdc-a25a-7d4599f3545e-kube-api-access-7qk6n" (OuterVolumeSpecName: "kube-api-access-7qk6n") pod "302ea27f-2a86-4bdc-a25a-7d4599f3545e" (UID: "302ea27f-2a86-4bdc-a25a-7d4599f3545e"). InnerVolumeSpecName "kube-api-access-7qk6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.177309 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "302ea27f-2a86-4bdc-a25a-7d4599f3545e" (UID: "302ea27f-2a86-4bdc-a25a-7d4599f3545e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.228597 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qk6n\" (UniqueName: \"kubernetes.io/projected/302ea27f-2a86-4bdc-a25a-7d4599f3545e-kube-api-access-7qk6n\") on node \"crc\" DevicePath \"\"" Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.228659 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.228674 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/302ea27f-2a86-4bdc-a25a-7d4599f3545e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.932130 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c7rtb" Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.969858 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c7rtb"] Mar 20 18:49:26 crc kubenswrapper[4690]: I0320 18:49:26.981780 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c7rtb"] Mar 20 18:49:27 crc kubenswrapper[4690]: I0320 18:49:27.383651 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:27 crc kubenswrapper[4690]: I0320 18:49:27.384069 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:27 crc kubenswrapper[4690]: I0320 18:49:27.763108 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:27 crc kubenswrapper[4690]: I0320 18:49:27.898758 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" path="/var/lib/kubelet/pods/302ea27f-2a86-4bdc-a25a-7d4599f3545e/volumes" Mar 20 18:49:28 crc kubenswrapper[4690]: I0320 18:49:28.015887 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:29 crc kubenswrapper[4690]: I0320 18:49:29.439684 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2d2pk"] Mar 20 18:49:29 crc kubenswrapper[4690]: I0320 18:49:29.962094 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2d2pk" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerName="registry-server" containerID="cri-o://fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba" gracePeriod=2 Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.469002 4690 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.619953 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-catalog-content\") pod \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.620371 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-utilities\") pod \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.620424 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p9v6\" (UniqueName: \"kubernetes.io/projected/5ec30ac9-d298-48d6-bdf1-570165be3bc2-kube-api-access-2p9v6\") pod \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\" (UID: \"5ec30ac9-d298-48d6-bdf1-570165be3bc2\") " Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.621542 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-utilities" (OuterVolumeSpecName: "utilities") pod "5ec30ac9-d298-48d6-bdf1-570165be3bc2" (UID: "5ec30ac9-d298-48d6-bdf1-570165be3bc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.634110 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec30ac9-d298-48d6-bdf1-570165be3bc2-kube-api-access-2p9v6" (OuterVolumeSpecName: "kube-api-access-2p9v6") pod "5ec30ac9-d298-48d6-bdf1-570165be3bc2" (UID: "5ec30ac9-d298-48d6-bdf1-570165be3bc2"). InnerVolumeSpecName "kube-api-access-2p9v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.722204 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.722239 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p9v6\" (UniqueName: \"kubernetes.io/projected/5ec30ac9-d298-48d6-bdf1-570165be3bc2-kube-api-access-2p9v6\") on node \"crc\" DevicePath \"\"" Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.876102 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ec30ac9-d298-48d6-bdf1-570165be3bc2" (UID: "5ec30ac9-d298-48d6-bdf1-570165be3bc2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.925755 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec30ac9-d298-48d6-bdf1-570165be3bc2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.976704 4690 generic.go:334] "Generic (PLEG): container finished" podID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerID="fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba" exitCode=0 Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.976761 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d2pk" event={"ID":"5ec30ac9-d298-48d6-bdf1-570165be3bc2","Type":"ContainerDied","Data":"fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba"} Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.976795 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d2pk" event={"ID":"5ec30ac9-d298-48d6-bdf1-570165be3bc2","Type":"ContainerDied","Data":"cb226f0d1eae37b37ed3e47895faf21a3c46e934fd3159acd8bc5bfd367e3e42"} Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.976821 4690 scope.go:117] "RemoveContainer" containerID="fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba" Mar 20 18:49:30 crc kubenswrapper[4690]: I0320 18:49:30.976969 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2d2pk" Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.018697 4690 scope.go:117] "RemoveContainer" containerID="f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1" Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.029584 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2d2pk"] Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.036993 4690 scope.go:117] "RemoveContainer" containerID="7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5" Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.039389 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2d2pk"] Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.078713 4690 scope.go:117] "RemoveContainer" containerID="fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba" Mar 20 18:49:31 crc kubenswrapper[4690]: E0320 18:49:31.079186 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba\": container with ID starting with fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba not found: ID does not exist" containerID="fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba" Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.079234 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba"} err="failed to get container status \"fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba\": rpc error: code = NotFound desc = could not find container \"fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba\": container with ID starting with fc013d1da2ba6de29c0857c3c6a6b78731ab5ea5a50f0d41d4a71ba9cce1bbba not found: ID does not exist" Mar 20 
18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.079282 4690 scope.go:117] "RemoveContainer" containerID="f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1" Mar 20 18:49:31 crc kubenswrapper[4690]: E0320 18:49:31.079922 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1\": container with ID starting with f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1 not found: ID does not exist" containerID="f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1" Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.079982 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1"} err="failed to get container status \"f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1\": rpc error: code = NotFound desc = could not find container \"f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1\": container with ID starting with f0ec0a692c0844b97066577de661d6dd7c94c248be84835c0adf549f975a43b1 not found: ID does not exist" Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.079999 4690 scope.go:117] "RemoveContainer" containerID="7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5" Mar 20 18:49:31 crc kubenswrapper[4690]: E0320 18:49:31.080325 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5\": container with ID starting with 7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5 not found: ID does not exist" containerID="7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5" Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.080365 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5"} err="failed to get container status \"7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5\": rpc error: code = NotFound desc = could not find container \"7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5\": container with ID starting with 7e694c04b3e1a71bd4b385734d373fbbfb8adec28c43d8d9db44a69507f681c5 not found: ID does not exist" Mar 20 18:49:31 crc kubenswrapper[4690]: I0320 18:49:31.893486 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" path="/var/lib/kubelet/pods/5ec30ac9-d298-48d6-bdf1-570165be3bc2/volumes" Mar 20 18:49:34 crc kubenswrapper[4690]: I0320 18:49:34.882806 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:49:34 crc kubenswrapper[4690]: E0320 18:49:34.883633 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:49:48 crc kubenswrapper[4690]: I0320 18:49:48.883597 4690 scope.go:117] "RemoveContainer" 
containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:49:48 crc kubenswrapper[4690]: E0320 18:49:48.884304 4690 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wtg2q_openshift-machine-config-operator(c18651e4-89e3-43fd-a780-bfa6df87591e)\"" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" podUID="c18651e4-89e3-43fd-a780-bfa6df87591e" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.149184 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567210-kf28s"] Mar 20 18:50:00 crc kubenswrapper[4690]: E0320 18:50:00.150891 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerName="extract-utilities" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.150905 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerName="extract-utilities" Mar 20 18:50:00 crc kubenswrapper[4690]: E0320 18:50:00.150913 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerName="registry-server" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.150920 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerName="registry-server" Mar 20 18:50:00 crc kubenswrapper[4690]: E0320 18:50:00.150942 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerName="registry-server" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.150949 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerName="registry-server" Mar 20 18:50:00 crc kubenswrapper[4690]: E0320 18:50:00.150964 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerName="extract-content" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.150970 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerName="extract-content" Mar 20 18:50:00 crc kubenswrapper[4690]: E0320 18:50:00.150996 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerName="extract-content" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.151002 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerName="extract-content" Mar 20 18:50:00 crc kubenswrapper[4690]: E0320 18:50:00.151012 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerName="extract-utilities" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.151017 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerName="extract-utilities" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.151207 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="302ea27f-2a86-4bdc-a25a-7d4599f3545e" containerName="registry-server" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.151225 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec30ac9-d298-48d6-bdf1-570165be3bc2" containerName="registry-server" Mar 20 18:50:00 
crc kubenswrapper[4690]: I0320 18:50:00.151845 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567210-kf28s" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.154918 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.155432 4690 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.155447 4690 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5fwhb" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.161135 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567210-kf28s"] Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.287092 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6m9n\" (UniqueName: \"kubernetes.io/projected/e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5-kube-api-access-m6m9n\") pod \"auto-csr-approver-29567210-kf28s\" (UID: \"e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5\") " pod="openshift-infra/auto-csr-approver-29567210-kf28s" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.389445 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6m9n\" (UniqueName: \"kubernetes.io/projected/e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5-kube-api-access-m6m9n\") pod \"auto-csr-approver-29567210-kf28s\" (UID: \"e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5\") " pod="openshift-infra/auto-csr-approver-29567210-kf28s" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.409567 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6m9n\" (UniqueName: \"kubernetes.io/projected/e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5-kube-api-access-m6m9n\") pod \"auto-csr-approver-29567210-kf28s\" (UID: \"e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5\") " pod="openshift-infra/auto-csr-approver-29567210-kf28s" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.515789 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567210-kf28s" Mar 20 18:50:00 crc kubenswrapper[4690]: I0320 18:50:00.962404 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567210-kf28s"] Mar 20 18:50:01 crc kubenswrapper[4690]: I0320 18:50:01.275035 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567210-kf28s" event={"ID":"e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5","Type":"ContainerStarted","Data":"3c42cf047c6d4364e83d6f56f6e12a828dbea4f6b7f44ec8ef7b542de082def5"} Mar 20 18:50:01 crc kubenswrapper[4690]: I0320 18:50:01.884737 4690 scope.go:117] "RemoveContainer" containerID="fbdcba45779a1815c161907da2b0a7af3b6b510d3ce48a9115db0a6fc2b46293" Mar 20 18:50:03 crc kubenswrapper[4690]: I0320 18:50:03.294574 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wtg2q" event={"ID":"c18651e4-89e3-43fd-a780-bfa6df87591e","Type":"ContainerStarted","Data":"1b480c3201785188c2432950d34cfdf6a06894ee8e73b257dcdade78553f6345"} Mar 20 18:50:03 crc kubenswrapper[4690]: I0320 18:50:03.309090 4690 generic.go:334] "Generic (PLEG): container finished" podID="e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5" containerID="4bec3c625511dd2d612124ad34d47685349fe4c120063206a18e74735f6de935" exitCode=0 Mar 20 18:50:03 crc kubenswrapper[4690]: I0320 18:50:03.309146 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567210-kf28s" event={"ID":"e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5","Type":"ContainerDied","Data":"4bec3c625511dd2d612124ad34d47685349fe4c120063206a18e74735f6de935"} Mar 20 18:50:04 crc kubenswrapper[4690]: I0320 18:50:04.701858 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567210-kf28s" Mar 20 18:50:04 crc kubenswrapper[4690]: I0320 18:50:04.781727 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6m9n\" (UniqueName: \"kubernetes.io/projected/e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5-kube-api-access-m6m9n\") pod \"e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5\" (UID: \"e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5\") " Mar 20 18:50:04 crc kubenswrapper[4690]: I0320 18:50:04.788851 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5-kube-api-access-m6m9n" (OuterVolumeSpecName: "kube-api-access-m6m9n") pod "e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5" (UID: "e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5"). InnerVolumeSpecName "kube-api-access-m6m9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:50:04 crc kubenswrapper[4690]: I0320 18:50:04.884287 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6m9n\" (UniqueName: \"kubernetes.io/projected/e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5-kube-api-access-m6m9n\") on node \"crc\" DevicePath \"\"" Mar 20 18:50:05 crc kubenswrapper[4690]: I0320 18:50:05.333167 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567210-kf28s" event={"ID":"e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5","Type":"ContainerDied","Data":"3c42cf047c6d4364e83d6f56f6e12a828dbea4f6b7f44ec8ef7b542de082def5"} Mar 20 18:50:05 crc kubenswrapper[4690]: I0320 18:50:05.333585 4690 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c42cf047c6d4364e83d6f56f6e12a828dbea4f6b7f44ec8ef7b542de082def5" Mar 20 18:50:05 crc kubenswrapper[4690]: I0320 18:50:05.333210 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567210-kf28s" Mar 20 18:50:05 crc kubenswrapper[4690]: I0320 18:50:05.788998 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4wjch"] Mar 20 18:50:05 crc kubenswrapper[4690]: I0320 18:50:05.794441 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4wjch"] Mar 20 18:50:05 crc kubenswrapper[4690]: I0320 18:50:05.895110 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f313f18e-e2ae-49d0-92df-2a27a669f159" path="/var/lib/kubelet/pods/f313f18e-e2ae-49d0-92df-2a27a669f159/volumes" Mar 20 18:50:26 crc kubenswrapper[4690]: I0320 18:50:26.922613 4690 scope.go:117] "RemoveContainer" containerID="da4c7bf380ca28f1bfc7774e88a39af318882548719be67922e0f3436a9c119a" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.595844 4690 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pnhtq"] Mar 20 18:50:40 crc kubenswrapper[4690]: E0320 18:50:40.597382 4690 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5" containerName="oc" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.597409 4690 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5" containerName="oc" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.597736 4690 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c5ef78-e74e-4eba-8d6a-bb3d7cc3a8d5" containerName="oc" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.600478 4690 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.623022 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pnhtq"] Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.661877 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-utilities\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.661952 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq4pw\" (UniqueName: \"kubernetes.io/projected/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-kube-api-access-nq4pw\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.662323 4690 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-catalog-content\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.763677 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-catalog-content\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.764234 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-catalog-content\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.764375 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-utilities\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.764473 4690 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq4pw\" (UniqueName: \"kubernetes.io/projected/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-kube-api-access-nq4pw\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.764744 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-utilities\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.785907 4690 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nq4pw\" (UniqueName: \"kubernetes.io/projected/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-kube-api-access-nq4pw\") pod \"redhat-operators-pnhtq\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") " pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:40 crc kubenswrapper[4690]: I0320 18:50:40.935013 4690 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pnhtq" Mar 20 18:50:41 crc kubenswrapper[4690]: I0320 18:50:41.407893 4690 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pnhtq"] Mar 20 18:50:41 crc kubenswrapper[4690]: W0320 18:50:41.414770 4690 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47e30ea2_a3d2_4cd4_a7ba_fa61a5fa26a0.slice/crio-cc64faf36401e6843a9031da3e410d7cc2a3f02f3dac7586b958cc28be758e68 WatchSource:0}: Error finding container cc64faf36401e6843a9031da3e410d7cc2a3f02f3dac7586b958cc28be758e68: Status 404 returned error can't find the container with id cc64faf36401e6843a9031da3e410d7cc2a3f02f3dac7586b958cc28be758e68 Mar 20 18:50:41 crc kubenswrapper[4690]: I0320 18:50:41.723555 4690 generic.go:334] "Generic (PLEG): container finished" podID="47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" containerID="d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b" exitCode=0 Mar 20 18:50:41 crc kubenswrapper[4690]: I0320 18:50:41.723660 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnhtq" event={"ID":"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0","Type":"ContainerDied","Data":"d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b"} Mar 20 18:50:41 crc kubenswrapper[4690]: I0320 18:50:41.723866 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnhtq" event={"ID":"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0","Type":"ContainerStarted","Data":"cc64faf36401e6843a9031da3e410d7cc2a3f02f3dac7586b958cc28be758e68"} Mar 20 18:50:43 crc kubenswrapper[4690]: I0320 18:50:43.770367 4690 generic.go:334] "Generic (PLEG): container finished" podID="47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" containerID="653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243" exitCode=0 Mar 20 18:50:43 crc kubenswrapper[4690]: I0320 18:50:43.770440 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnhtq" event={"ID":"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0","Type":"ContainerDied","Data":"653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243"} Mar 20 18:50:44 crc kubenswrapper[4690]: I0320 18:50:44.780658 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnhtq" event={"ID":"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0","Type":"ContainerStarted","Data":"22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8"} Mar 20 18:50:44 crc kubenswrapper[4690]: I0320 18:50:44.803627 4690 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pnhtq" podStartSLOduration=2.335769224 podStartE2EDuration="4.803604228s" podCreationTimestamp="2026-03-20 18:50:40 +0000 UTC" firstStartedPulling="2026-03-20 18:50:41.725170991 +0000 UTC m=+4716.590996669" lastFinishedPulling="2026-03-20 18:50:44.193005985 +0000 UTC m=+4719.058831673" observedRunningTime="2026-03-20 18:50:44.797593398 +0000 UTC m=+4719.663419076" watchObservedRunningTime="2026-03-20 18:50:44.803604228 
Mar 20 18:50:50 crc kubenswrapper[4690]: I0320 18:50:50.935532 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pnhtq"
Mar 20 18:50:50 crc kubenswrapper[4690]: I0320 18:50:50.936183 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pnhtq"
Mar 20 18:50:52 crc kubenswrapper[4690]: I0320 18:50:52.003217 4690 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pnhtq" podUID="47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" containerName="registry-server" probeResult="failure" output=<
Mar 20 18:50:52 crc kubenswrapper[4690]: timeout: failed to connect service ":50051" within 1s
Mar 20 18:50:52 crc kubenswrapper[4690]: >
Mar 20 18:51:01 crc kubenswrapper[4690]: I0320 18:51:01.029459 4690 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pnhtq"
Mar 20 18:51:01 crc kubenswrapper[4690]: I0320 18:51:01.121394 4690 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pnhtq"
Mar 20 18:51:01 crc kubenswrapper[4690]: I0320 18:51:01.269952 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pnhtq"]
Mar 20 18:51:02 crc kubenswrapper[4690]: I0320 18:51:02.984627 4690 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pnhtq" podUID="47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" containerName="registry-server" containerID="cri-o://22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8" gracePeriod=2
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.500607 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pnhtq"
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.663301 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-catalog-content\") pod \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") "
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.663406 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq4pw\" (UniqueName: \"kubernetes.io/projected/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-kube-api-access-nq4pw\") pod \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") "
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.663543 4690 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-utilities\") pod \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\" (UID: \"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0\") "
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.664817 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-utilities" (OuterVolumeSpecName: "utilities") pod "47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" (UID: "47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.669615 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-kube-api-access-nq4pw" (OuterVolumeSpecName: "kube-api-access-nq4pw") pod "47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" (UID: "47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0"). InnerVolumeSpecName "kube-api-access-nq4pw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.766148 4690 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq4pw\" (UniqueName: \"kubernetes.io/projected/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-kube-api-access-nq4pw\") on node \"crc\" DevicePath \"\""
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.766197 4690 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.820549 4690 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" (UID: "47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.868467 4690 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.994531 4690 generic.go:334] "Generic (PLEG): container finished" podID="47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" containerID="22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8" exitCode=0
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.994575 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnhtq" event={"ID":"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0","Type":"ContainerDied","Data":"22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8"}
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.994600 4690 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pnhtq" event={"ID":"47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0","Type":"ContainerDied","Data":"cc64faf36401e6843a9031da3e410d7cc2a3f02f3dac7586b958cc28be758e68"}
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.994616 4690 scope.go:117] "RemoveContainer" containerID="22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8"
Mar 20 18:51:03 crc kubenswrapper[4690]: I0320 18:51:03.994723 4690 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pnhtq"
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.023374 4690 scope.go:117] "RemoveContainer" containerID="653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243"
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.025009 4690 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pnhtq"]
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.032759 4690 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pnhtq"]
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.061745 4690 scope.go:117] "RemoveContainer" containerID="d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b"
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.112771 4690 scope.go:117] "RemoveContainer" containerID="22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8"
Mar 20 18:51:04 crc kubenswrapper[4690]: E0320 18:51:04.113627 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8\": container with ID starting with 22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8 not found: ID does not exist" containerID="22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8"
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.113687 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8"} err="failed to get container status \"22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8\": rpc error: code = NotFound desc = could not find container \"22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8\": container with ID starting with 22d29e328609881a0771cd49fe2a2d093964f360208cd416d8eb45407d9330a8 not found: ID does not exist"
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.113730 4690 scope.go:117] "RemoveContainer" containerID="653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243"
Mar 20 18:51:04 crc kubenswrapper[4690]: E0320 18:51:04.114213 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243\": container with ID starting with 653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243 not found: ID does not exist" containerID="653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243"
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.114279 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243"} err="failed to get container status \"653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243\": rpc error: code = NotFound desc = could not find container \"653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243\": container with ID starting with 653bd398e10747d983d681f317f2c2e5c100d574959a13c09b1278842e908243 not found: ID does not exist"
Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.114314 4690 scope.go:117] "RemoveContainer" containerID="d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b"
Mar 20 18:51:04 crc kubenswrapper[4690]: E0320 18:51:04.114795 4690 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b\": container with ID starting with d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b not found: ID does not exist" containerID="d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b"
err="rpc error: code = NotFound desc = could not find container \"d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b\": container with ID starting with d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b not found: ID does not exist" containerID="d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b" Mar 20 18:51:04 crc kubenswrapper[4690]: I0320 18:51:04.114828 4690 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b"} err="failed to get container status \"d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b\": rpc error: code = NotFound desc = could not find container \"d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b\": container with ID starting with d0ac701f24699d9730340773153ef440c641142b330f997e7ddd8c3879d7713b not found: ID does not exist" Mar 20 18:51:05 crc kubenswrapper[4690]: I0320 18:51:05.896613 4690 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0" path="/var/lib/kubelet/pods/47e30ea2-a3d2-4cd4-a7ba-fa61a5fa26a0/volumes" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515157313517024455 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015157313520017364 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015157301677016521 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015157301700015454 5ustar corecore